Jan 23 04:00:48 np0005593294 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 04:00:48 np0005593294 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 04:00:48 np0005593294 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 04:00:48 np0005593294 kernel: BIOS-provided physical RAM map:
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 04:00:48 np0005593294 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 23 04:00:48 np0005593294 kernel: NX (Execute Disable) protection: active
Jan 23 04:00:48 np0005593294 kernel: APIC: Static calls initialized
Jan 23 04:00:48 np0005593294 kernel: SMBIOS 2.8 present.
Jan 23 04:00:48 np0005593294 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 23 04:00:48 np0005593294 kernel: Hypervisor detected: KVM
Jan 23 04:00:48 np0005593294 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 04:00:48 np0005593294 kernel: kvm-clock: using sched offset of 3915146415 cycles
Jan 23 04:00:48 np0005593294 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 04:00:48 np0005593294 kernel: tsc: Detected 2800.000 MHz processor
Jan 23 04:00:48 np0005593294 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 23 04:00:48 np0005593294 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 04:00:48 np0005593294 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 04:00:48 np0005593294 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 23 04:00:48 np0005593294 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 23 04:00:48 np0005593294 kernel: Using GB pages for direct mapping
Jan 23 04:00:48 np0005593294 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 04:00:48 np0005593294 kernel: ACPI: Early table checksum verification disabled
Jan 23 04:00:48 np0005593294 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 23 04:00:48 np0005593294 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:48 np0005593294 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:48 np0005593294 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:48 np0005593294 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 23 04:00:48 np0005593294 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:48 np0005593294 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:48 np0005593294 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 23 04:00:48 np0005593294 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 23 04:00:48 np0005593294 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 23 04:00:48 np0005593294 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 23 04:00:48 np0005593294 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 23 04:00:48 np0005593294 kernel: No NUMA configuration found
Jan 23 04:00:48 np0005593294 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 23 04:00:48 np0005593294 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 23 04:00:48 np0005593294 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 23 04:00:48 np0005593294 kernel: Zone ranges:
Jan 23 04:00:48 np0005593294 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 04:00:48 np0005593294 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 04:00:48 np0005593294 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 04:00:48 np0005593294 kernel:  Device   empty
Jan 23 04:00:48 np0005593294 kernel: Movable zone start for each node
Jan 23 04:00:48 np0005593294 kernel: Early memory node ranges
Jan 23 04:00:48 np0005593294 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 04:00:48 np0005593294 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 23 04:00:48 np0005593294 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 04:00:48 np0005593294 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 23 04:00:48 np0005593294 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 04:00:48 np0005593294 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 04:00:48 np0005593294 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 04:00:48 np0005593294 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 04:00:48 np0005593294 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 04:00:48 np0005593294 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 04:00:48 np0005593294 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 04:00:48 np0005593294 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 04:00:48 np0005593294 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 04:00:48 np0005593294 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 04:00:48 np0005593294 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 04:00:48 np0005593294 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 04:00:48 np0005593294 kernel: TSC deadline timer available
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Max. logical packages:   8
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Max. logical dies:       8
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Max. dies per package:   1
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Max. threads per core:   1
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Num. cores per package:     1
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Num. threads per package:   1
Jan 23 04:00:48 np0005593294 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 23 04:00:48 np0005593294 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 04:00:48 np0005593294 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 04:00:48 np0005593294 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 23 04:00:48 np0005593294 kernel: Booting paravirtualized kernel on KVM
Jan 23 04:00:48 np0005593294 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 04:00:48 np0005593294 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 23 04:00:48 np0005593294 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 23 04:00:48 np0005593294 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 23 04:00:48 np0005593294 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 04:00:48 np0005593294 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 04:00:48 np0005593294 kernel: random: crng init done
Jan 23 04:00:48 np0005593294 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: Fallback order for Node 0: 0 
Jan 23 04:00:48 np0005593294 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 04:00:48 np0005593294 kernel: Policy zone: Normal
Jan 23 04:00:48 np0005593294 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 04:00:48 np0005593294 kernel: software IO TLB: area num 8.
Jan 23 04:00:48 np0005593294 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 23 04:00:48 np0005593294 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 04:00:48 np0005593294 kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 04:00:48 np0005593294 kernel: Dynamic Preempt: voluntary
Jan 23 04:00:48 np0005593294 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 04:00:48 np0005593294 kernel: rcu: 	RCU event tracing is enabled.
Jan 23 04:00:48 np0005593294 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 23 04:00:48 np0005593294 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 23 04:00:48 np0005593294 kernel: 	Rude variant of Tasks RCU enabled.
Jan 23 04:00:48 np0005593294 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 23 04:00:48 np0005593294 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 04:00:48 np0005593294 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 23 04:00:48 np0005593294 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 04:00:48 np0005593294 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 04:00:48 np0005593294 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 04:00:48 np0005593294 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 23 04:00:48 np0005593294 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 04:00:48 np0005593294 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 04:00:48 np0005593294 kernel: Console: colour VGA+ 80x25
Jan 23 04:00:48 np0005593294 kernel: printk: console [ttyS0] enabled
Jan 23 04:00:48 np0005593294 kernel: ACPI: Core revision 20230331
Jan 23 04:00:48 np0005593294 kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 04:00:48 np0005593294 kernel: x2apic enabled
Jan 23 04:00:48 np0005593294 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 04:00:48 np0005593294 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 04:00:48 np0005593294 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 23 04:00:48 np0005593294 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 04:00:48 np0005593294 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 04:00:48 np0005593294 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 04:00:48 np0005593294 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 04:00:48 np0005593294 kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 04:00:48 np0005593294 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 04:00:48 np0005593294 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 23 04:00:48 np0005593294 kernel: RETBleed: Mitigation: untrained return thunk
Jan 23 04:00:48 np0005593294 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 04:00:48 np0005593294 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 04:00:48 np0005593294 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 04:00:48 np0005593294 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 04:00:48 np0005593294 kernel: x86/bugs: return thunk changed
Jan 23 04:00:48 np0005593294 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 04:00:48 np0005593294 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 04:00:48 np0005593294 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 04:00:48 np0005593294 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 04:00:48 np0005593294 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 04:00:48 np0005593294 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 23 04:00:48 np0005593294 kernel: Freeing SMP alternatives memory: 40K
Jan 23 04:00:48 np0005593294 kernel: pid_max: default: 32768 minimum: 301
Jan 23 04:00:48 np0005593294 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 04:00:48 np0005593294 kernel: landlock: Up and running.
Jan 23 04:00:48 np0005593294 kernel: Yama: becoming mindful.
Jan 23 04:00:48 np0005593294 kernel: SELinux:  Initializing.
Jan 23 04:00:48 np0005593294 kernel: LSM support for eBPF active
Jan 23 04:00:48 np0005593294 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 23 04:00:48 np0005593294 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 04:00:48 np0005593294 kernel: ... version:                0
Jan 23 04:00:48 np0005593294 kernel: ... bit width:              48
Jan 23 04:00:48 np0005593294 kernel: ... generic registers:      6
Jan 23 04:00:48 np0005593294 kernel: ... value mask:             0000ffffffffffff
Jan 23 04:00:48 np0005593294 kernel: ... max period:             00007fffffffffff
Jan 23 04:00:48 np0005593294 kernel: ... fixed-purpose events:   0
Jan 23 04:00:48 np0005593294 kernel: ... event mask:             000000000000003f
Jan 23 04:00:48 np0005593294 kernel: signal: max sigframe size: 1776
Jan 23 04:00:48 np0005593294 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 04:00:48 np0005593294 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 23 04:00:48 np0005593294 kernel: smp: Bringing up secondary CPUs ...
Jan 23 04:00:48 np0005593294 kernel: smpboot: x86: Booting SMP configuration:
Jan 23 04:00:48 np0005593294 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 23 04:00:48 np0005593294 kernel: smp: Brought up 1 node, 8 CPUs
Jan 23 04:00:48 np0005593294 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 23 04:00:48 np0005593294 kernel: node 0 deferred pages initialised in 10ms
Jan 23 04:00:48 np0005593294 kernel: Memory: 7763888K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 23 04:00:48 np0005593294 kernel: devtmpfs: initialized
Jan 23 04:00:48 np0005593294 kernel: x86/mm: Memory block size: 128MB
Jan 23 04:00:48 np0005593294 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 04:00:48 np0005593294 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 23 04:00:48 np0005593294 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 04:00:48 np0005593294 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 04:00:48 np0005593294 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 04:00:48 np0005593294 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 04:00:48 np0005593294 kernel: audit: initializing netlink subsys (disabled)
Jan 23 04:00:48 np0005593294 kernel: audit: type=2000 audit(1769158846.456:1): state=initialized audit_enabled=0 res=1
Jan 23 04:00:48 np0005593294 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 04:00:48 np0005593294 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 04:00:48 np0005593294 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 04:00:48 np0005593294 kernel: cpuidle: using governor menu
Jan 23 04:00:48 np0005593294 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 04:00:48 np0005593294 kernel: PCI: Using configuration type 1 for base access
Jan 23 04:00:48 np0005593294 kernel: PCI: Using configuration type 1 for extended access
Jan 23 04:00:48 np0005593294 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 04:00:48 np0005593294 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 04:00:48 np0005593294 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 04:00:48 np0005593294 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 04:00:48 np0005593294 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 04:00:48 np0005593294 kernel: Demotion targets for Node 0: null
Jan 23 04:00:48 np0005593294 kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 04:00:48 np0005593294 kernel: ACPI: Added _OSI(Module Device)
Jan 23 04:00:48 np0005593294 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 04:00:48 np0005593294 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 04:00:48 np0005593294 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 04:00:48 np0005593294 kernel: ACPI: Interpreter enabled
Jan 23 04:00:48 np0005593294 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 23 04:00:48 np0005593294 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 04:00:48 np0005593294 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 04:00:48 np0005593294 kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 04:00:48 np0005593294 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 23 04:00:48 np0005593294 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 04:00:48 np0005593294 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [3] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [4] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [5] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [6] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [7] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [8] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [9] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [10] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [11] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [12] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [13] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [14] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [15] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [16] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [17] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [18] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [19] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [20] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [21] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [22] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [23] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [24] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [25] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [26] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [27] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [28] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [29] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [30] registered
Jan 23 04:00:48 np0005593294 kernel: acpiphp: Slot [31] registered
Jan 23 04:00:48 np0005593294 kernel: PCI host bridge to bus 0000:00
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 23 04:00:48 np0005593294 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 04:00:48 np0005593294 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 04:00:48 np0005593294 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 04:00:48 np0005593294 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 04:00:48 np0005593294 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 23 04:00:48 np0005593294 kernel: iommu: Default domain type: Translated
Jan 23 04:00:48 np0005593294 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 04:00:48 np0005593294 kernel: SCSI subsystem initialized
Jan 23 04:00:48 np0005593294 kernel: ACPI: bus type USB registered
Jan 23 04:00:48 np0005593294 kernel: usbcore: registered new interface driver usbfs
Jan 23 04:00:48 np0005593294 kernel: usbcore: registered new interface driver hub
Jan 23 04:00:48 np0005593294 kernel: usbcore: registered new device driver usb
Jan 23 04:00:48 np0005593294 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 04:00:48 np0005593294 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 04:00:48 np0005593294 kernel: PTP clock support registered
Jan 23 04:00:48 np0005593294 kernel: EDAC MC: Ver: 3.0.0
Jan 23 04:00:48 np0005593294 kernel: NetLabel: Initializing
Jan 23 04:00:48 np0005593294 kernel: NetLabel:  domain hash size = 128
Jan 23 04:00:48 np0005593294 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 04:00:48 np0005593294 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 04:00:48 np0005593294 kernel: PCI: Using ACPI for IRQ routing
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 04:00:48 np0005593294 kernel: vgaarb: loaded
Jan 23 04:00:48 np0005593294 kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 04:00:48 np0005593294 kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 04:00:48 np0005593294 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 04:00:48 np0005593294 kernel: pnp: PnP ACPI init
Jan 23 04:00:48 np0005593294 kernel: pnp: PnP ACPI: found 5 devices
Jan 23 04:00:48 np0005593294 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_INET protocol family
Jan 23 04:00:48 np0005593294 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 04:00:48 np0005593294 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_XDP protocol family
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 23 04:00:48 np0005593294 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 23 04:00:48 np0005593294 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 23 04:00:48 np0005593294 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 77158 usecs
Jan 23 04:00:48 np0005593294 kernel: PCI: CLS 0 bytes, default 64
Jan 23 04:00:48 np0005593294 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 04:00:48 np0005593294 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 23 04:00:48 np0005593294 kernel: Trying to unpack rootfs image as initramfs...
Jan 23 04:00:48 np0005593294 kernel: ACPI: bus type thunderbolt registered
Jan 23 04:00:48 np0005593294 kernel: Initialise system trusted keyrings
Jan 23 04:00:48 np0005593294 kernel: Key type blacklist registered
Jan 23 04:00:48 np0005593294 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 04:00:48 np0005593294 kernel: zbud: loaded
Jan 23 04:00:48 np0005593294 kernel: integrity: Platform Keyring initialized
Jan 23 04:00:48 np0005593294 kernel: integrity: Machine keyring initialized
Jan 23 04:00:48 np0005593294 kernel: Freeing initrd memory: 87956K
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_ALG protocol family
Jan 23 04:00:48 np0005593294 kernel: xor: automatically using best checksumming function   avx       
Jan 23 04:00:48 np0005593294 kernel: Key type asymmetric registered
Jan 23 04:00:48 np0005593294 kernel: Asymmetric key parser 'x509' registered
Jan 23 04:00:48 np0005593294 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 04:00:48 np0005593294 kernel: io scheduler mq-deadline registered
Jan 23 04:00:48 np0005593294 kernel: io scheduler kyber registered
Jan 23 04:00:48 np0005593294 kernel: io scheduler bfq registered
Jan 23 04:00:48 np0005593294 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 04:00:48 np0005593294 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 04:00:48 np0005593294 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 04:00:48 np0005593294 kernel: ACPI: button: Power Button [PWRF]
Jan 23 04:00:48 np0005593294 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 23 04:00:48 np0005593294 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 23 04:00:48 np0005593294 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 23 04:00:48 np0005593294 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 04:00:48 np0005593294 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 04:00:48 np0005593294 kernel: Non-volatile memory driver v1.3
Jan 23 04:00:48 np0005593294 kernel: rdac: device handler registered
Jan 23 04:00:48 np0005593294 kernel: hp_sw: device handler registered
Jan 23 04:00:48 np0005593294 kernel: emc: device handler registered
Jan 23 04:00:48 np0005593294 kernel: alua: device handler registered
Jan 23 04:00:48 np0005593294 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 23 04:00:48 np0005593294 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 23 04:00:48 np0005593294 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 23 04:00:48 np0005593294 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 23 04:00:48 np0005593294 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 04:00:48 np0005593294 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 04:00:48 np0005593294 kernel: usb usb1: Product: UHCI Host Controller
Jan 23 04:00:48 np0005593294 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 04:00:48 np0005593294 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 23 04:00:48 np0005593294 kernel: hub 1-0:1.0: USB hub found
Jan 23 04:00:48 np0005593294 kernel: hub 1-0:1.0: 2 ports detected
Jan 23 04:00:48 np0005593294 kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 04:00:48 np0005593294 kernel: usbserial: USB Serial support registered for generic
Jan 23 04:00:48 np0005593294 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 04:00:48 np0005593294 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 04:00:48 np0005593294 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 04:00:48 np0005593294 kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 04:00:48 np0005593294 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 23 04:00:48 np0005593294 kernel: rtc_cmos 00:04: registered as rtc0
Jan 23 04:00:48 np0005593294 kernel: rtc_cmos 00:04: setting system clock to 2026-01-23T09:00:47 UTC (1769158847)
Jan 23 04:00:48 np0005593294 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 23 04:00:48 np0005593294 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 04:00:48 np0005593294 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 04:00:48 np0005593294 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 04:00:48 np0005593294 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 04:00:48 np0005593294 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 04:00:48 np0005593294 kernel: usbcore: registered new interface driver usbhid
Jan 23 04:00:48 np0005593294 kernel: usbhid: USB HID core driver
Jan 23 04:00:48 np0005593294 kernel: drop_monitor: Initializing network drop monitor service
Jan 23 04:00:48 np0005593294 kernel: Initializing XFRM netlink socket
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_INET6 protocol family
Jan 23 04:00:48 np0005593294 kernel: Segment Routing with IPv6
Jan 23 04:00:48 np0005593294 kernel: NET: Registered PF_PACKET protocol family
Jan 23 04:00:48 np0005593294 kernel: mpls_gso: MPLS GSO support
Jan 23 04:00:48 np0005593294 kernel: IPI shorthand broadcast: enabled
Jan 23 04:00:48 np0005593294 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 04:00:48 np0005593294 kernel: AES CTR mode by8 optimization enabled
Jan 23 04:00:48 np0005593294 kernel: sched_clock: Marking stable (2011002213, 152216626)->(2321699301, -158480462)
Jan 23 04:00:48 np0005593294 kernel: registered taskstats version 1
Jan 23 04:00:48 np0005593294 kernel: Loading compiled-in X.509 certificates
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 04:00:48 np0005593294 kernel: Demotion targets for Node 0: null
Jan 23 04:00:48 np0005593294 kernel: page_owner is disabled
Jan 23 04:00:48 np0005593294 kernel: Key type .fscrypt registered
Jan 23 04:00:48 np0005593294 kernel: Key type fscrypt-provisioning registered
Jan 23 04:00:48 np0005593294 kernel: Key type big_key registered
Jan 23 04:00:48 np0005593294 kernel: Key type encrypted registered
Jan 23 04:00:48 np0005593294 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 04:00:48 np0005593294 kernel: Loading compiled-in module X.509 certificates
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 04:00:48 np0005593294 kernel: ima: Allocated hash algorithm: sha256
Jan 23 04:00:48 np0005593294 kernel: ima: No architecture policies found
Jan 23 04:00:48 np0005593294 kernel: evm: Initialising EVM extended attributes:
Jan 23 04:00:48 np0005593294 kernel: evm: security.selinux
Jan 23 04:00:48 np0005593294 kernel: evm: security.SMACK64 (disabled)
Jan 23 04:00:48 np0005593294 kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 04:00:48 np0005593294 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 04:00:48 np0005593294 kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 04:00:48 np0005593294 kernel: evm: security.apparmor (disabled)
Jan 23 04:00:48 np0005593294 kernel: evm: security.ima
Jan 23 04:00:48 np0005593294 kernel: evm: security.capability
Jan 23 04:00:48 np0005593294 kernel: evm: HMAC attrs: 0x1
Jan 23 04:00:48 np0005593294 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 04:00:48 np0005593294 kernel: Running certificate verification RSA selftest
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 04:00:48 np0005593294 kernel: Running certificate verification ECDSA selftest
Jan 23 04:00:48 np0005593294 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 04:00:48 np0005593294 kernel: clk: Disabling unused clocks
Jan 23 04:00:48 np0005593294 kernel: Freeing unused decrypted memory: 2028K
Jan 23 04:00:48 np0005593294 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 04:00:48 np0005593294 kernel: Write protecting the kernel read-only data: 30720k
Jan 23 04:00:48 np0005593294 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 04:00:48 np0005593294 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 04:00:48 np0005593294 kernel: Run /init as init process
Jan 23 04:00:48 np0005593294 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 04:00:48 np0005593294 systemd: Detected virtualization kvm.
Jan 23 04:00:48 np0005593294 systemd: Detected architecture x86-64.
Jan 23 04:00:48 np0005593294 systemd: Running in initrd.
Jan 23 04:00:48 np0005593294 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 04:00:48 np0005593294 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 04:00:48 np0005593294 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 04:00:48 np0005593294 kernel: usb 1-1: Manufacturer: QEMU
Jan 23 04:00:48 np0005593294 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 23 04:00:48 np0005593294 systemd: No hostname configured, using default hostname.
Jan 23 04:00:48 np0005593294 systemd: Hostname set to <localhost>.
Jan 23 04:00:48 np0005593294 systemd: Initializing machine ID from VM UUID.
Jan 23 04:00:48 np0005593294 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 04:00:48 np0005593294 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 23 04:00:48 np0005593294 systemd: Queued start job for default target Initrd Default Target.
Jan 23 04:00:48 np0005593294 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 04:00:48 np0005593294 systemd: Reached target Local Encrypted Volumes.
Jan 23 04:00:48 np0005593294 systemd: Reached target Initrd /usr File System.
Jan 23 04:00:48 np0005593294 systemd: Reached target Local File Systems.
Jan 23 04:00:48 np0005593294 systemd: Reached target Path Units.
Jan 23 04:00:48 np0005593294 systemd: Reached target Slice Units.
Jan 23 04:00:48 np0005593294 systemd: Reached target Swaps.
Jan 23 04:00:48 np0005593294 systemd: Reached target Timer Units.
Jan 23 04:00:48 np0005593294 systemd: Listening on D-Bus System Message Bus Socket.
Jan 23 04:00:48 np0005593294 systemd: Listening on Journal Socket (/dev/log).
Jan 23 04:00:48 np0005593294 systemd: Listening on Journal Socket.
Jan 23 04:00:48 np0005593294 systemd: Listening on udev Control Socket.
Jan 23 04:00:48 np0005593294 systemd: Listening on udev Kernel Socket.
Jan 23 04:00:48 np0005593294 systemd: Reached target Socket Units.
Jan 23 04:00:48 np0005593294 systemd: Starting Create List of Static Device Nodes...
Jan 23 04:00:48 np0005593294 systemd: Starting Journal Service...
Jan 23 04:00:48 np0005593294 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 04:00:48 np0005593294 systemd: Starting Apply Kernel Variables...
Jan 23 04:00:48 np0005593294 systemd: Starting Create System Users...
Jan 23 04:00:48 np0005593294 systemd: Starting Setup Virtual Console...
Jan 23 04:00:48 np0005593294 systemd: Finished Create List of Static Device Nodes.
Jan 23 04:00:48 np0005593294 systemd: Finished Apply Kernel Variables.
Jan 23 04:00:48 np0005593294 systemd: Finished Create System Users.
Jan 23 04:00:48 np0005593294 systemd-journald[307]: Journal started
Jan 23 04:00:48 np0005593294 systemd-journald[307]: Runtime Journal (/run/log/journal/53821a391f4a4bf2b036ba3044ea8780) is 8.0M, max 153.6M, 145.6M free.
Jan 23 04:00:48 np0005593294 systemd-sysusers[312]: Creating group 'users' with GID 100.
Jan 23 04:00:48 np0005593294 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Jan 23 04:00:48 np0005593294 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 04:00:48 np0005593294 systemd: Starting Create Static Device Nodes in /dev...
Jan 23 04:00:48 np0005593294 systemd: Started Journal Service.
Jan 23 04:00:48 np0005593294 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 04:00:48 np0005593294 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 04:00:48 np0005593294 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 04:00:48 np0005593294 systemd[1]: Finished Setup Virtual Console.
Jan 23 04:00:48 np0005593294 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 04:00:48 np0005593294 systemd[1]: Starting dracut cmdline hook...
Jan 23 04:00:48 np0005593294 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 04:00:48 np0005593294 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 04:00:48 np0005593294 systemd[1]: Finished dracut cmdline hook.
Jan 23 04:00:48 np0005593294 systemd[1]: Starting dracut pre-udev hook...
Jan 23 04:00:48 np0005593294 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 04:00:48 np0005593294 kernel: device-mapper: uevent: version 1.0.3
Jan 23 04:00:48 np0005593294 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 04:00:48 np0005593294 kernel: RPC: Registered named UNIX socket transport module.
Jan 23 04:00:48 np0005593294 kernel: RPC: Registered udp transport module.
Jan 23 04:00:48 np0005593294 kernel: RPC: Registered tcp transport module.
Jan 23 04:00:48 np0005593294 kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 04:00:48 np0005593294 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 04:00:48 np0005593294 rpc.statd[443]: Version 2.5.4 starting
Jan 23 04:00:48 np0005593294 rpc.statd[443]: Initializing NSM state
Jan 23 04:00:48 np0005593294 rpc.idmapd[448]: Setting log level to 0
Jan 23 04:00:48 np0005593294 systemd[1]: Finished dracut pre-udev hook.
Jan 23 04:00:48 np0005593294 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 04:00:48 np0005593294 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 04:00:48 np0005593294 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 04:00:48 np0005593294 systemd[1]: Starting dracut pre-trigger hook...
Jan 23 04:00:48 np0005593294 systemd[1]: Finished dracut pre-trigger hook.
Jan 23 04:00:48 np0005593294 systemd[1]: Starting Coldplug All udev Devices...
Jan 23 04:00:48 np0005593294 systemd[1]: Created slice Slice /system/modprobe.
Jan 23 04:00:49 np0005593294 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 04:00:49 np0005593294 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 04:00:49 np0005593294 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 04:00:49 np0005593294 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 04:00:49 np0005593294 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target Network.
Jan 23 04:00:49 np0005593294 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 04:00:49 np0005593294 systemd[1]: Starting dracut initqueue hook...
Jan 23 04:00:49 np0005593294 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 23 04:00:49 np0005593294 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 04:00:49 np0005593294 kernel: vda: vda1
Jan 23 04:00:49 np0005593294 systemd[1]: Mounting Kernel Configuration File System...
Jan 23 04:00:49 np0005593294 systemd[1]: Mounted Kernel Configuration File System.
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target System Initialization.
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target Basic System.
Jan 23 04:00:49 np0005593294 systemd-udevd[492]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:00:49 np0005593294 kernel: scsi host0: ata_piix
Jan 23 04:00:49 np0005593294 kernel: scsi host1: ata_piix
Jan 23 04:00:49 np0005593294 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 23 04:00:49 np0005593294 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 23 04:00:49 np0005593294 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target Initrd Root Device.
Jan 23 04:00:49 np0005593294 kernel: ata1: found unknown device (class 0)
Jan 23 04:00:49 np0005593294 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 04:00:49 np0005593294 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 04:00:49 np0005593294 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 04:00:49 np0005593294 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 04:00:49 np0005593294 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 04:00:49 np0005593294 systemd[1]: Finished dracut initqueue hook.
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 04:00:49 np0005593294 systemd[1]: Reached target Remote File Systems.
Jan 23 04:00:49 np0005593294 systemd[1]: Starting dracut pre-mount hook...
Jan 23 04:00:49 np0005593294 systemd[1]: Finished dracut pre-mount hook.
Jan 23 04:00:49 np0005593294 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 04:00:49 np0005593294 systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 04:00:49 np0005593294 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 04:00:49 np0005593294 systemd[1]: Mounting /sysroot...
Jan 23 04:00:49 np0005593294 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 04:00:49 np0005593294 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 04:00:50 np0005593294 kernel: XFS (vda1): Ending clean mount
Jan 23 04:00:50 np0005593294 systemd[1]: Mounted /sysroot.
Jan 23 04:00:50 np0005593294 systemd[1]: Reached target Initrd Root File System.
Jan 23 04:00:50 np0005593294 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 04:00:50 np0005593294 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 04:00:50 np0005593294 systemd[1]: Reached target Initrd File Systems.
Jan 23 04:00:50 np0005593294 systemd[1]: Reached target Initrd Default Target.
Jan 23 04:00:50 np0005593294 systemd[1]: Starting dracut mount hook...
Jan 23 04:00:50 np0005593294 systemd[1]: Finished dracut mount hook.
Jan 23 04:00:50 np0005593294 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 04:00:50 np0005593294 rpc.idmapd[448]: exiting on signal 15
Jan 23 04:00:50 np0005593294 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 04:00:50 np0005593294 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Network.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Timer Units.
Jan 23 04:00:50 np0005593294 systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Initrd Default Target.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Basic System.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Initrd Root Device.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Initrd /usr File System.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Path Units.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Remote File Systems.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Slice Units.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Socket Units.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target System Initialization.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Local File Systems.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Swaps.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut mount hook.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut pre-mount hook.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut initqueue hook.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Setup Virtual Console.
Jan 23 04:00:50 np0005593294 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Closed udev Control Socket.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Closed udev Kernel Socket.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut pre-udev hook.
Jan 23 04:00:50 np0005593294 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped dracut cmdline hook.
Jan 23 04:00:50 np0005593294 systemd[1]: Starting Cleanup udev Database...
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 04:00:50 np0005593294 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 04:00:50 np0005593294 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Stopped Create System Users.
Jan 23 04:00:50 np0005593294 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 04:00:50 np0005593294 systemd[1]: Finished Cleanup udev Database.
Jan 23 04:00:50 np0005593294 systemd[1]: Reached target Switch Root.
Jan 23 04:00:50 np0005593294 systemd[1]: Starting Switch Root...
Jan 23 04:00:50 np0005593294 systemd[1]: Switching root.
Jan 23 04:00:50 np0005593294 systemd-journald[307]: Journal stopped
Jan 23 04:00:51 np0005593294 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 23 04:00:51 np0005593294 kernel: audit: type=1404 audit(1769158850.837:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:00:51 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:00:51 np0005593294 kernel: audit: type=1403 audit(1769158850.970:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 04:00:51 np0005593294 systemd: Successfully loaded SELinux policy in 137.140ms.
Jan 23 04:00:51 np0005593294 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.224ms.
Jan 23 04:00:51 np0005593294 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 04:00:51 np0005593294 systemd: Detected virtualization kvm.
Jan 23 04:00:51 np0005593294 systemd: Detected architecture x86-64.
Jan 23 04:00:51 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:51 np0005593294 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd: Stopped Switch Root.
Jan 23 04:00:51 np0005593294 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 04:00:51 np0005593294 systemd: Created slice Slice /system/getty.
Jan 23 04:00:51 np0005593294 systemd: Created slice Slice /system/serial-getty.
Jan 23 04:00:51 np0005593294 systemd: Created slice Slice /system/sshd-keygen.
Jan 23 04:00:51 np0005593294 systemd: Created slice User and Session Slice.
Jan 23 04:00:51 np0005593294 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 04:00:51 np0005593294 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 23 04:00:51 np0005593294 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 04:00:51 np0005593294 systemd: Reached target Local Encrypted Volumes.
Jan 23 04:00:51 np0005593294 systemd: Stopped target Switch Root.
Jan 23 04:00:51 np0005593294 systemd: Stopped target Initrd File Systems.
Jan 23 04:00:51 np0005593294 systemd: Stopped target Initrd Root File System.
Jan 23 04:00:51 np0005593294 systemd: Reached target Local Integrity Protected Volumes.
Jan 23 04:00:51 np0005593294 systemd: Reached target Path Units.
Jan 23 04:00:51 np0005593294 systemd: Reached target rpc_pipefs.target.
Jan 23 04:00:51 np0005593294 systemd: Reached target Slice Units.
Jan 23 04:00:51 np0005593294 systemd: Reached target Swaps.
Jan 23 04:00:51 np0005593294 systemd: Reached target Local Verity Protected Volumes.
Jan 23 04:00:51 np0005593294 systemd: Listening on RPCbind Server Activation Socket.
Jan 23 04:00:51 np0005593294 systemd: Reached target RPC Port Mapper.
Jan 23 04:00:51 np0005593294 systemd: Listening on Process Core Dump Socket.
Jan 23 04:00:51 np0005593294 systemd: Listening on initctl Compatibility Named Pipe.
Jan 23 04:00:51 np0005593294 systemd: Listening on udev Control Socket.
Jan 23 04:00:51 np0005593294 systemd: Listening on udev Kernel Socket.
Jan 23 04:00:51 np0005593294 systemd: Mounting Huge Pages File System...
Jan 23 04:00:51 np0005593294 systemd: Mounting POSIX Message Queue File System...
Jan 23 04:00:51 np0005593294 systemd: Mounting Kernel Debug File System...
Jan 23 04:00:51 np0005593294 systemd: Mounting Kernel Trace File System...
Jan 23 04:00:51 np0005593294 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 04:00:51 np0005593294 systemd: Starting Create List of Static Device Nodes...
Jan 23 04:00:51 np0005593294 systemd: Starting Load Kernel Module configfs...
Jan 23 04:00:51 np0005593294 systemd: Starting Load Kernel Module drm...
Jan 23 04:00:51 np0005593294 systemd: Starting Load Kernel Module efi_pstore...
Jan 23 04:00:51 np0005593294 systemd: Starting Load Kernel Module fuse...
Jan 23 04:00:51 np0005593294 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 04:00:51 np0005593294 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd: Stopped File System Check on Root Device.
Jan 23 04:00:51 np0005593294 systemd: Stopped Journal Service.
Jan 23 04:00:51 np0005593294 systemd: Starting Journal Service...
Jan 23 04:00:51 np0005593294 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 04:00:51 np0005593294 kernel: fuse: init (API version 7.37)
Jan 23 04:00:51 np0005593294 systemd: Starting Generate network units from Kernel command line...
Jan 23 04:00:51 np0005593294 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 04:00:51 np0005593294 systemd: Starting Remount Root and Kernel File Systems...
Jan 23 04:00:51 np0005593294 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 04:00:51 np0005593294 systemd: Starting Apply Kernel Variables...
Jan 23 04:00:51 np0005593294 systemd: Starting Coldplug All udev Devices...
Jan 23 04:00:51 np0005593294 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:51 np0005593294 systemd: Mounted Huge Pages File System.
Jan 23 04:00:51 np0005593294 systemd: Mounted POSIX Message Queue File System.
Jan 23 04:00:51 np0005593294 systemd: Mounted Kernel Debug File System.
Jan 23 04:00:51 np0005593294 systemd: Mounted Kernel Trace File System.
Jan 23 04:00:51 np0005593294 systemd: Finished Create List of Static Device Nodes.
Jan 23 04:00:51 np0005593294 systemd-journald[677]: Journal started
Jan 23 04:00:51 np0005593294 systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 04:00:51 np0005593294 systemd[1]: Queued start job for default target Multi-User System.
Jan 23 04:00:51 np0005593294 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd: Started Journal Service.
Jan 23 04:00:51 np0005593294 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 04:00:51 np0005593294 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 04:00:51 np0005593294 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Load Kernel Module fuse.
Jan 23 04:00:51 np0005593294 kernel: ACPI: bus type drm_connector registered
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 04:00:51 np0005593294 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Load Kernel Module drm.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Apply Kernel Variables.
Jan 23 04:00:51 np0005593294 systemd[1]: Mounting FUSE Control File System...
Jan 23 04:00:51 np0005593294 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Rebuild Hardware Database...
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 04:00:51 np0005593294 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Create System Users...
Jan 23 04:00:51 np0005593294 systemd[1]: Mounted FUSE Control File System.
Jan 23 04:00:51 np0005593294 systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 04:00:51 np0005593294 systemd-journald[677]: Received client request to flush runtime journal.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 04:00:51 np0005593294 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Create System Users.
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 04:00:51 np0005593294 systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 04:00:51 np0005593294 systemd[1]: Reached target Local File Systems.
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 04:00:51 np0005593294 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 04:00:51 np0005593294 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 04:00:51 np0005593294 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 04:00:51 np0005593294 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 04:00:51 np0005593294 bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Security Auditing Service...
Jan 23 04:00:51 np0005593294 systemd[1]: Starting RPC Bind...
Jan 23 04:00:51 np0005593294 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 04:00:51 np0005593294 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 04:00:51 np0005593294 systemd[1]: Started RPC Bind.
Jan 23 04:00:51 np0005593294 augenrules[706]: /sbin/augenrules: No change
Jan 23 04:00:51 np0005593294 augenrules[721]: No rules
Jan 23 04:00:51 np0005593294 augenrules[721]: enabled 1
Jan 23 04:00:51 np0005593294 augenrules[721]: failure 1
Jan 23 04:00:51 np0005593294 augenrules[721]: pid 701
Jan 23 04:00:51 np0005593294 augenrules[721]: rate_limit 0
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog_limit 8192
Jan 23 04:00:51 np0005593294 augenrules[721]: lost 0
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog 1
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog_wait_time 60000
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog_wait_time_actual 0
Jan 23 04:00:51 np0005593294 augenrules[721]: enabled 1
Jan 23 04:00:51 np0005593294 augenrules[721]: failure 1
Jan 23 04:00:51 np0005593294 augenrules[721]: pid 701
Jan 23 04:00:51 np0005593294 augenrules[721]: rate_limit 0
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog_limit 8192
Jan 23 04:00:51 np0005593294 augenrules[721]: lost 0
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog 0
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog_wait_time 60000
Jan 23 04:00:51 np0005593294 augenrules[721]: backlog_wait_time_actual 0
Jan 23 04:00:51 np0005593294 systemd[1]: Started Security Auditing Service.
Jan 23 04:00:51 np0005593294 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 04:00:51 np0005593294 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 04:00:52 np0005593294 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 04:00:52 np0005593294 systemd[1]: Finished Rebuild Hardware Database.
Jan 23 04:00:52 np0005593294 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 04:00:52 np0005593294 systemd[1]: Starting Update is Completed...
Jan 23 04:00:52 np0005593294 systemd[1]: Finished Update is Completed.
Jan 23 04:00:52 np0005593294 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 04:00:52 np0005593294 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 04:00:52 np0005593294 systemd[1]: Reached target System Initialization.
Jan 23 04:00:52 np0005593294 systemd[1]: Started dnf makecache --timer.
Jan 23 04:00:52 np0005593294 systemd[1]: Started Daily rotation of log files.
Jan 23 04:00:52 np0005593294 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 04:00:52 np0005593294 systemd[1]: Reached target Timer Units.
Jan 23 04:00:52 np0005593294 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 04:00:52 np0005593294 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 04:00:52 np0005593294 systemd[1]: Reached target Socket Units.
Jan 23 04:00:52 np0005593294 systemd[1]: Starting D-Bus System Message Bus...
Jan 23 04:00:52 np0005593294 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 04:00:52 np0005593294 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 04:00:52 np0005593294 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 04:00:52 np0005593294 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 04:00:52 np0005593294 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 04:00:52 np0005593294 systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:00:52 np0005593294 systemd[1]: Started D-Bus System Message Bus.
Jan 23 04:00:52 np0005593294 systemd[1]: Reached target Basic System.
Jan 23 04:00:52 np0005593294 systemd[1]: Starting NTP client/server...
Jan 23 04:00:52 np0005593294 dbus-broker-lau[753]: Ready
Jan 23 04:00:52 np0005593294 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 04:00:52 np0005593294 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 23 04:00:52 np0005593294 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 04:00:52 np0005593294 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 04:00:52 np0005593294 chronyd[781]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 04:00:52 np0005593294 chronyd[781]: Loaded 0 symmetric keys
Jan 23 04:00:52 np0005593294 chronyd[781]: Using right/UTC timezone to obtain leap second data
Jan 23 04:00:52 np0005593294 chronyd[781]: Loaded seccomp filter (level 2)
Jan 23 04:00:52 np0005593294 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 04:00:52 np0005593294 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 04:00:52 np0005593294 systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 04:00:52 np0005593294 systemd[1]: Started irqbalance daemon.
Jan 23 04:00:52 np0005593294 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 04:00:52 np0005593294 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:00:52 np0005593294 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:00:52 np0005593294 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:00:52 np0005593294 systemd[1]: Reached target sshd-keygen.target.
Jan 23 04:00:52 np0005593294 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 23 04:00:52 np0005593294 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 23 04:00:52 np0005593294 kernel: kvm_amd: TSC scaling supported
Jan 23 04:00:52 np0005593294 kernel: kvm_amd: Nested Virtualization enabled
Jan 23 04:00:52 np0005593294 kernel: kvm_amd: Nested Paging enabled
Jan 23 04:00:52 np0005593294 kernel: kvm_amd: LBR virtualization supported
Jan 23 04:00:52 np0005593294 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 04:00:52 np0005593294 systemd[1]: Reached target User and Group Name Lookups.
Jan 23 04:00:52 np0005593294 kernel: Console: switching to colour dummy device 80x25
Jan 23 04:00:52 np0005593294 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 04:00:52 np0005593294 kernel: [drm] features: -context_init
Jan 23 04:00:52 np0005593294 kernel: [drm] number of scanouts: 1
Jan 23 04:00:52 np0005593294 kernel: [drm] number of cap sets: 0
Jan 23 04:00:52 np0005593294 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 23 04:00:52 np0005593294 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 04:00:52 np0005593294 kernel: Console: switching to colour frame buffer device 128x48
Jan 23 04:00:52 np0005593294 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 04:00:52 np0005593294 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 04:00:52 np0005593294 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 04:00:52 np0005593294 systemd[1]: Starting User Login Management...
Jan 23 04:00:52 np0005593294 systemd[1]: Started NTP client/server.
Jan 23 04:00:52 np0005593294 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 04:00:53 np0005593294 systemd-logind[807]: New seat seat0.
Jan 23 04:00:53 np0005593294 systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 04:00:53 np0005593294 systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 04:00:53 np0005593294 systemd[1]: Started User Login Management.
Jan 23 04:00:53 np0005593294 iptables.init[789]: iptables: Applying firewall rules: [  OK  ]
Jan 23 04:00:53 np0005593294 systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 04:00:53 np0005593294 cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 09:00:53 +0000. Up 7.99 seconds.
Jan 23 04:00:53 np0005593294 systemd[1]: run-cloud\x2dinit-tmp-tmpl3lm0fky.mount: Deactivated successfully.
Jan 23 04:00:53 np0005593294 systemd[1]: Starting Hostname Service...
Jan 23 04:00:53 np0005593294 systemd[1]: Started Hostname Service.
Jan 23 04:00:53 np0005593294 systemd-hostnamed[852]: Hostname set to <np0005593294.novalocal> (static)
Jan 23 04:00:54 np0005593294 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 04:00:54 np0005593294 systemd[1]: Reached target Preparation for Network.
Jan 23 04:00:54 np0005593294 systemd[1]: Starting Network Manager...
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3524] NetworkManager (version 1.54.3-2.el9) is starting... (boot:0ec3f185-e60c-43ea-a74e-c21caf2508ae)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3529] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3612] manager[0x561d9de58000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3649] hostname: hostname: using hostnamed
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3649] hostname: static hostname changed from (none) to "np0005593294.novalocal"
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3657] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3815] manager[0x561d9de58000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.3816] manager[0x561d9de58000]: rfkill: WWAN hardware radio set enabled
Jan 23 04:00:54 np0005593294 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4041] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4042] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4047] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4048] manager: Networking is enabled by state file
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4052] settings: Loaded settings plugin: keyfile (internal)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4067] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4101] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4119] dhcp: init: Using DHCP client 'internal'
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4123] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4145] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4157] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4171] device (lo): Activation: starting connection 'lo' (6a1055b1-2674-4e8e-9fff-1fce9dcc1052)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4187] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4192] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4233] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4241] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4244] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4247] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593294 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4251] device (eth0): carrier: link connected
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4256] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4273] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 04:00:54 np0005593294 systemd[1]: Started Network Manager.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4281] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4288] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4289] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4293] manager: NetworkManager state is now CONNECTING
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4295] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593294 systemd[1]: Reached target Network.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4306] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4309] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:00:54 np0005593294 systemd[1]: Starting Network Manager Wait Online...
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4350] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4359] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 04:00:54 np0005593294 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4379] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593294 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4492] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4495] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4501] device (lo): Activation: successful, device activated.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4521] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4523] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4527] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4529] device (eth0): Activation: successful, device activated.
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4534] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 04:00:54 np0005593294 NetworkManager[856]: <info>  [1769158854.4537] manager: startup complete
Jan 23 04:00:54 np0005593294 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 04:00:54 np0005593294 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 04:00:54 np0005593294 systemd[1]: Reached target NFS client services.
Jan 23 04:00:54 np0005593294 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 04:00:54 np0005593294 systemd[1]: Reached target Remote File Systems.
Jan 23 04:00:54 np0005593294 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 04:00:54 np0005593294 systemd[1]: Finished Network Manager Wait Online.
Jan 23 04:00:54 np0005593294 systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 04:00:54 np0005593294 cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 09:00:54 +0000. Up 9.23 seconds.
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |  eth0  | True |         38.129.56.30         | 255.255.255.0 | global | fa:16:3e:02:24:b4 |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe02:24b4/64 |       .       |  link  | fa:16:3e:02:24:b4 |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 23 04:00:54 np0005593294 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 04:00:58 np0005593294 cloud-init[920]: Generating public/private rsa key pair.
Jan 23 04:00:58 np0005593294 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 04:00:58 np0005593294 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 04:00:58 np0005593294 cloud-init[920]: The key fingerprint is:
Jan 23 04:00:58 np0005593294 cloud-init[920]: SHA256:edOWuOchfDQbnTRH8Bc/YcFEmOHQYCN+DGWnL80pn3E root@np0005593294.novalocal
Jan 23 04:00:58 np0005593294 cloud-init[920]: The key's randomart image is:
Jan 23 04:00:58 np0005593294 cloud-init[920]: +---[RSA 3072]----+
Jan 23 04:00:58 np0005593294 cloud-init[920]: |         o.BooOOo|
Jan 23 04:00:58 np0005593294 cloud-init[920]: |        . * *+.++|
Jan 23 04:00:58 np0005593294 cloud-init[920]: |         . + .o.=|
Jan 23 04:00:58 np0005593294 cloud-init[920]: |         ..o++.+o|
Jan 23 04:00:58 np0005593294 cloud-init[920]: |        S +oB*oE |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |         o =+++  |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |          + =o   |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |           = .   |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |            .    |
Jan 23 04:00:58 np0005593294 cloud-init[920]: +----[SHA256]-----+
Jan 23 04:00:58 np0005593294 cloud-init[920]: Generating public/private ecdsa key pair.
Jan 23 04:00:58 np0005593294 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 04:00:58 np0005593294 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 04:00:58 np0005593294 cloud-init[920]: The key fingerprint is:
Jan 23 04:00:58 np0005593294 cloud-init[920]: SHA256:o0A+QcrZ0hR2kiK37DAHwDSF5p7eUg06J2fwyvGJpUo root@np0005593294.novalocal
Jan 23 04:00:58 np0005593294 cloud-init[920]: The key's randomart image is:
Jan 23 04:00:58 np0005593294 cloud-init[920]: +---[ECDSA 256]---+
Jan 23 04:00:58 np0005593294 cloud-init[920]: |=oo.*o.          |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |o=+O.o           |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |o==o=            |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |oo+= .           |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |.== *   S        |
Jan 23 04:00:58 np0005593294 cloud-init[920]: | B.B + . .       |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |oE/ . .          |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |.B +             |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |o .              |
Jan 23 04:00:58 np0005593294 cloud-init[920]: +----[SHA256]-----+
Jan 23 04:00:58 np0005593294 cloud-init[920]: Generating public/private ed25519 key pair.
Jan 23 04:00:58 np0005593294 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 04:00:58 np0005593294 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 04:00:58 np0005593294 cloud-init[920]: The key fingerprint is:
Jan 23 04:00:58 np0005593294 cloud-init[920]: SHA256:yXgWegZQ6Icfku55tlvbeBAKsmPA73B26hK/CNQHO40 root@np0005593294.novalocal
Jan 23 04:00:58 np0005593294 cloud-init[920]: The key's randomart image is:
Jan 23 04:00:58 np0005593294 cloud-init[920]: +--[ED25519 256]--+
Jan 23 04:00:58 np0005593294 cloud-init[920]: |    .o.          |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |    ..           |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |.  o o. .        |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |..o @ o=.o       |
Jan 23 04:00:58 np0005593294 cloud-init[920]: | o.E BooS.       |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |.o+++.o=.        |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |..Boo.  ..       |
Jan 23 04:00:58 np0005593294 cloud-init[920]: |...+o o. +.      |
Jan 23 04:00:58 np0005593294 cloud-init[920]: | .oo.oooo..      |
Jan 23 04:00:58 np0005593294 cloud-init[920]: +----[SHA256]-----+
Jan 23 04:00:58 np0005593294 systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 04:00:58 np0005593294 systemd[1]: Reached target Cloud-config availability.
Jan 23 04:00:58 np0005593294 systemd[1]: Reached target Network is Online.
Jan 23 04:00:58 np0005593294 systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 04:00:58 np0005593294 systemd[1]: Starting Crash recovery kernel arming...
Jan 23 04:00:58 np0005593294 systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 04:00:58 np0005593294 systemd[1]: Starting System Logging Service...
Jan 23 04:00:58 np0005593294 sm-notify[1005]: Version 2.5.4 starting
Jan 23 04:00:58 np0005593294 systemd[1]: Starting OpenSSH server daemon...
Jan 23 04:00:58 np0005593294 systemd[1]: Starting Permit User Sessions...
Jan 23 04:00:58 np0005593294 systemd[1]: Started Notify NFS peers of a restart.
Jan 23 04:00:58 np0005593294 systemd[1]: Started OpenSSH server daemon.
Jan 23 04:00:58 np0005593294 systemd[1]: Finished Permit User Sessions.
Jan 23 04:00:58 np0005593294 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 23 04:00:58 np0005593294 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 04:00:58 np0005593294 systemd[1]: Started Command Scheduler.
Jan 23 04:00:58 np0005593294 systemd[1]: Started Getty on tty1.
Jan 23 04:00:58 np0005593294 systemd[1]: Started Serial Getty on ttyS0.
Jan 23 04:00:58 np0005593294 systemd[1]: Reached target Login Prompts.
Jan 23 04:00:58 np0005593294 systemd[1]: Started System Logging Service.
Jan 23 04:00:58 np0005593294 systemd[1]: Reached target Multi-User System.
Jan 23 04:00:58 np0005593294 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 04:00:58 np0005593294 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 04:00:58 np0005593294 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 04:00:58 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:00:58 np0005593294 kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Jan 23 04:00:58 np0005593294 kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 04:00:58 np0005593294 cloud-init[1152]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 09:00:58 +0000. Up 12.85 seconds.
Jan 23 04:00:58 np0005593294 systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 04:00:58 np0005593294 systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 04:00:58 np0005593294 dracut[1285]: dracut-057-102.git20250818.el9
Jan 23 04:00:58 np0005593294 cloud-init[1303]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 09:00:58 +0000. Up 13.23 seconds.
Jan 23 04:00:58 np0005593294 cloud-init[1314]: #############################################################
Jan 23 04:00:58 np0005593294 dracut[1287]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 04:00:58 np0005593294 cloud-init[1316]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 04:00:58 np0005593294 cloud-init[1323]: 256 SHA256:o0A+QcrZ0hR2kiK37DAHwDSF5p7eUg06J2fwyvGJpUo root@np0005593294.novalocal (ECDSA)
Jan 23 04:00:58 np0005593294 cloud-init[1328]: 256 SHA256:yXgWegZQ6Icfku55tlvbeBAKsmPA73B26hK/CNQHO40 root@np0005593294.novalocal (ED25519)
Jan 23 04:00:58 np0005593294 cloud-init[1334]: 3072 SHA256:edOWuOchfDQbnTRH8Bc/YcFEmOHQYCN+DGWnL80pn3E root@np0005593294.novalocal (RSA)
Jan 23 04:00:58 np0005593294 cloud-init[1336]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 04:00:58 np0005593294 cloud-init[1337]: #############################################################
Jan 23 04:00:58 np0005593294 cloud-init[1303]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 09:00:58 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.42 seconds
Jan 23 04:00:59 np0005593294 systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 04:00:59 np0005593294 systemd[1]: Reached target Cloud-init target.
Jan 23 04:00:59 np0005593294 chronyd[781]: Selected source 198.181.199.82 (2.centos.pool.ntp.org)
Jan 23 04:00:59 np0005593294 chronyd[781]: System clock TAI offset set to 37 seconds
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: memstrack is not available
Jan 23 04:00:59 np0005593294 dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 04:00:59 np0005593294 dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 04:01:00 np0005593294 dracut[1287]: memstrack is not available
Jan 23 04:01:00 np0005593294 dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 04:01:00 np0005593294 dracut[1287]: *** Including module: systemd ***
Jan 23 04:01:00 np0005593294 dracut[1287]: *** Including module: fips ***
Jan 23 04:01:00 np0005593294 dracut[1287]: *** Including module: systemd-initrd ***
Jan 23 04:01:00 np0005593294 dracut[1287]: *** Including module: i18n ***
Jan 23 04:01:00 np0005593294 dracut[1287]: *** Including module: drm ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: prefixdevname ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: kernel-modules ***
Jan 23 04:01:01 np0005593294 kernel: block vda: the capability attribute has been deprecated.
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: kernel-modules-extra ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: qemu ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: fstab-sys ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: rootfs-block ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: terminfo ***
Jan 23 04:01:01 np0005593294 dracut[1287]: *** Including module: udev-rules ***
Jan 23 04:01:02 np0005593294 dracut[1287]: Skipping udev rule: 91-permissions.rules
Jan 23 04:01:02 np0005593294 dracut[1287]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 04:01:02 np0005593294 dracut[1287]: *** Including module: virtiofs ***
Jan 23 04:01:02 np0005593294 dracut[1287]: *** Including module: dracut-systemd ***
Jan 23 04:01:02 np0005593294 dracut[1287]: *** Including module: usrmount ***
Jan 23 04:01:02 np0005593294 dracut[1287]: *** Including module: base ***
Jan 23 04:01:02 np0005593294 dracut[1287]: *** Including module: fs-lib ***
Jan 23 04:01:02 np0005593294 dracut[1287]: *** Including module: kdumpbase ***
Jan 23 04:01:03 np0005593294 dracut[1287]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 04:01:03 np0005593294 dracut[1287]:  microcode_ctl module: mangling fw_dir
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 04:01:03 np0005593294 irqbalance[791]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 23 04:01:03 np0005593294 irqbalance[791]: IRQ 25 affinity is now unmanaged
Jan 23 04:01:03 np0005593294 irqbalance[791]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 23 04:01:03 np0005593294 irqbalance[791]: IRQ 31 affinity is now unmanaged
Jan 23 04:01:03 np0005593294 irqbalance[791]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 23 04:01:03 np0005593294 irqbalance[791]: IRQ 28 affinity is now unmanaged
Jan 23 04:01:03 np0005593294 irqbalance[791]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 23 04:01:03 np0005593294 irqbalance[791]: IRQ 32 affinity is now unmanaged
Jan 23 04:01:03 np0005593294 irqbalance[791]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 23 04:01:03 np0005593294 irqbalance[791]: IRQ 30 affinity is now unmanaged
Jan 23 04:01:03 np0005593294 irqbalance[791]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 23 04:01:03 np0005593294 irqbalance[791]: IRQ 29 affinity is now unmanaged
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 04:01:03 np0005593294 dracut[1287]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 04:01:03 np0005593294 dracut[1287]: *** Including module: openssl ***
Jan 23 04:01:03 np0005593294 dracut[1287]: *** Including module: shutdown ***
Jan 23 04:01:03 np0005593294 dracut[1287]: *** Including module: squash ***
Jan 23 04:01:03 np0005593294 dracut[1287]: *** Including modules done ***
Jan 23 04:01:03 np0005593294 dracut[1287]: *** Installing kernel module dependencies ***
Jan 23 04:01:04 np0005593294 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:01:04 np0005593294 dracut[1287]: *** Installing kernel module dependencies done ***
Jan 23 04:01:04 np0005593294 dracut[1287]: *** Resolving executable dependencies ***
Jan 23 04:01:06 np0005593294 dracut[1287]: *** Resolving executable dependencies done ***
Jan 23 04:01:06 np0005593294 dracut[1287]: *** Generating early-microcode cpio image ***
Jan 23 04:01:06 np0005593294 dracut[1287]: *** Store current command line parameters ***
Jan 23 04:01:06 np0005593294 dracut[1287]: Stored kernel commandline:
Jan 23 04:01:06 np0005593294 dracut[1287]: No dracut internal kernel commandline stored in the initramfs
Jan 23 04:01:06 np0005593294 dracut[1287]: *** Install squash loader ***
Jan 23 04:01:07 np0005593294 dracut[1287]: *** Squashing the files inside the initramfs ***
Jan 23 04:01:08 np0005593294 dracut[1287]: *** Squashing the files inside the initramfs done ***
Jan 23 04:01:08 np0005593294 dracut[1287]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 04:01:08 np0005593294 dracut[1287]: *** Hardlinking files ***
Jan 23 04:01:08 np0005593294 dracut[1287]: *** Hardlinking files done ***
Jan 23 04:01:08 np0005593294 dracut[1287]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 04:01:09 np0005593294 kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Jan 23 04:01:09 np0005593294 kdumpctl[1019]: kdump: Starting kdump: [OK]
Jan 23 04:01:09 np0005593294 systemd[1]: Finished Crash recovery kernel arming.
Jan 23 04:01:09 np0005593294 systemd[1]: Startup finished in 2.342s (kernel) + 2.968s (initrd) + 18.552s (userspace) = 23.862s.
Jan 23 04:01:24 np0005593294 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:01:44 np0005593294 systemd[1]: Created slice User Slice of UID 1000.
Jan 23 04:01:44 np0005593294 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 04:01:44 np0005593294 systemd-logind[807]: New session 1 of user zuul.
Jan 23 04:01:44 np0005593294 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 04:01:44 np0005593294 systemd[1]: Starting User Manager for UID 1000...
Jan 23 04:01:44 np0005593294 systemd[4325]: Queued start job for default target Main User Target.
Jan 23 04:01:44 np0005593294 systemd[4325]: Created slice User Application Slice.
Jan 23 04:01:44 np0005593294 systemd[4325]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:01:44 np0005593294 systemd[4325]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:01:44 np0005593294 systemd[4325]: Reached target Paths.
Jan 23 04:01:44 np0005593294 systemd[4325]: Reached target Timers.
Jan 23 04:01:44 np0005593294 systemd[4325]: Starting D-Bus User Message Bus Socket...
Jan 23 04:01:44 np0005593294 systemd[4325]: Starting Create User's Volatile Files and Directories...
Jan 23 04:01:44 np0005593294 systemd[4325]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:01:44 np0005593294 systemd[4325]: Reached target Sockets.
Jan 23 04:01:44 np0005593294 systemd[4325]: Finished Create User's Volatile Files and Directories.
Jan 23 04:01:44 np0005593294 systemd[4325]: Reached target Basic System.
Jan 23 04:01:44 np0005593294 systemd[4325]: Reached target Main User Target.
Jan 23 04:01:44 np0005593294 systemd[4325]: Startup finished in 128ms.
Jan 23 04:01:44 np0005593294 systemd[1]: Started User Manager for UID 1000.
Jan 23 04:01:44 np0005593294 systemd[1]: Started Session 1 of User zuul.
Jan 23 04:01:45 np0005593294 python3[4407]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:01:48 np0005593294 python3[4435]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:01:55 np0005593294 python3[4493]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:01:56 np0005593294 python3[4533]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 04:01:58 np0005593294 python3[4559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQChWBsfs5FtlYIS47KhLNXtsYVhP6UT/w4WYq1l1d/b7+cXPAwAb4Qt1cc/BmNcKM419a6D+CvPejxC67s0h4ksuceBjB/s6b88/zjf8Lio8Dd87f6J+f6IY8ByYIQ8s3Hvn6z0K7HSyEMuQ0B/CLxeBW4MJFqcoLK2v7Y8SNPGLr8w/8y79OWnJJPKmfM4ACTo2JwqmPGI/4+LQsCZS/p/yKDTO5AYxsIUwWw/IX3Jxs67UOBqa40onmgM/VRkfGY512fziVUNkmFHG2Aqgosbpbz/XysrVTpvLRA/H2zpGbbTbuEg6xp8vHQO5V0csAd6p3cdOixjdaPmf9oy3+yXuIeWwnnxPHqvVDY6N9aaIX4vuajxOoMUFiQ2YtcDq7sCn8HoateyYgIL/u2+pInArUiYGemyMEWja0DhD6UdCkY0Ea+YDWeIZKM505N+HClR5jfjjVW35TndY+AldV5OhOzMRmPjtJYS8a0usUXRvmxRfMFSmO9CI1RfNmod9X0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:01:59 np0005593294 python3[4583]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:01:59 np0005593294 python3[4682]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:00 np0005593294 python3[4753]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158919.4722352-252-40954728514482/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa follow=False checksum=70fc72f3adde7c23bd22f0e2ad4ebdd2e15c011a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:00 np0005593294 python3[4876]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:01 np0005593294 python3[4947]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158920.550354-307-128609600897818/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa.pub follow=False checksum=1817e5216c13f90f69486a375706d090e99f2d79 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:02 np0005593294 python3[4995]: ansible-ping Invoked with data=pong
Jan 23 04:02:03 np0005593294 python3[5019]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:02:05 np0005593294 python3[5077]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 04:02:07 np0005593294 python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:07 np0005593294 python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:07 np0005593294 python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:07 np0005593294 python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:08 np0005593294 python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:08 np0005593294 python3[5229]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:10 np0005593294 python3[5255]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:10 np0005593294 python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:11 np0005593294 python3[5406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158930.4675694-32-101246637039073/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:12 np0005593294 python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:12 np0005593294 python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:12 np0005593294 python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:12 np0005593294 python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593294 python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593294 python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593294 python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593294 python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593294 python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593294 python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593294 python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593294 python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593294 python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593294 python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593294 python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593294 python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593294 python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593294 python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593294 python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:17 np0005593294 python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:17 np0005593294 python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:17 np0005593294 python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593294 python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593294 python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593294 python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593294 python3[6054]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:21 np0005593294 python3[6080]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:02:21 np0005593294 systemd[1]: Starting Time & Date Service...
Jan 23 04:02:22 np0005593294 systemd[1]: Started Time & Date Service.
Jan 23 04:02:22 np0005593294 systemd-timedated[6082]: Changed time zone to 'UTC' (UTC).
Jan 23 04:02:22 np0005593294 python3[6111]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:22 np0005593294 python3[6187]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:23 np0005593294 python3[6258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769158942.657495-252-246197202602001/source _original_basename=tmphpdpiom_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:23 np0005593294 python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:24 np0005593294 python3[6429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158943.544987-302-124129721618078/source _original_basename=tmpwhou6zj1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:24 np0005593294 python3[6531]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:25 np0005593294 python3[6604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158944.7444382-383-20136879228048/source _original_basename=tmpazju_cwh follow=False checksum=96d192923ef836711213a25c6ed0ba1e0702c4c3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:25 np0005593294 python3[6652]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:02:26 np0005593294 python3[6678]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:02:27 np0005593294 python3[6758]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:28 np0005593294 python3[6831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158947.3521514-452-112811626565435/source _original_basename=tmpt0kuwa9_ follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:28 np0005593294 python3[6882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-639e-86bd-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:02:29 np0005593294 python3[6910]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-639e-86bd-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 04:02:30 np0005593294 python3[6939]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:48 np0005593294 python3[6965]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:52 np0005593294 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:03:48 np0005593294 systemd-logind[807]: Session 1 logged out. Waiting for processes to exit.
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 23 04:03:54 np0005593294 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 23 04:03:54 np0005593294 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4082] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 04:03:54 np0005593294 systemd-udevd[6969]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4352] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4372] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4374] device (eth1): carrier: link connected
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4376] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4380] policy: auto-activating connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1)
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4383] device (eth1): Activation: starting connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1)
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4384] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4386] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4389] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:03:54 np0005593294 NetworkManager[856]: <info>  [1769159034.4392] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:03:54 np0005593294 systemd[4325]: Starting Mark boot as successful...
Jan 23 04:03:54 np0005593294 systemd[4325]: Finished Mark boot as successful.
Jan 23 04:03:55 np0005593294 systemd-logind[807]: New session 3 of user zuul.
Jan 23 04:03:55 np0005593294 systemd[1]: Started Session 3 of User zuul.
Jan 23 04:03:55 np0005593294 python3[7000]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4543-3693-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:04:05 np0005593294 python3[7080]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:04:05 np0005593294 python3[7153]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159045.3941975-155-196895230060800/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=cf8ee7cd7bc1fd6d9388d3c03a8ac5811adcc451 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:04:06 np0005593294 python3[7203]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:04:06 np0005593294 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 04:04:06 np0005593294 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 04:04:06 np0005593294 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 04:04:06 np0005593294 systemd[1]: Stopping Network Manager...
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.5967] caught SIGTERM, shutting down normally.
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.5980] dhcp4 (eth0): canceled DHCP transaction
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.5981] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.5981] dhcp4 (eth0): state changed no lease
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.5983] manager: NetworkManager state is now CONNECTING
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.6115] dhcp4 (eth1): canceled DHCP transaction
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.6115] dhcp4 (eth1): state changed no lease
Jan 23 04:04:06 np0005593294 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:04:06 np0005593294 NetworkManager[856]: <info>  [1769159046.6186] exiting (success)
Jan 23 04:04:06 np0005593294 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:04:06 np0005593294 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 04:04:06 np0005593294 systemd[1]: Stopped Network Manager.
Jan 23 04:04:06 np0005593294 systemd[1]: NetworkManager.service: Consumed 1.209s CPU time, 10.2M memory peak.
Jan 23 04:04:06 np0005593294 systemd[1]: Starting Network Manager...
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.6775] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0ec3f185-e60c-43ea-a74e-c21caf2508ae)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.6776] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.6819] manager[0x55f86ffe1000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 04:04:06 np0005593294 systemd[1]: Starting Hostname Service...
Jan 23 04:04:06 np0005593294 systemd[1]: Started Hostname Service.
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.7940] hostname: hostname: using hostnamed
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.7941] hostname: static hostname changed from (none) to "np0005593294.novalocal"
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.7949] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.7957] manager[0x55f86ffe1000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.7958] manager[0x55f86ffe1000]: rfkill: WWAN hardware radio set enabled
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8007] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8008] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8009] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8010] manager: Networking is enabled by state file
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8015] settings: Loaded settings plugin: keyfile (internal)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8021] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8062] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8078] dhcp: init: Using DHCP client 'internal'
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8082] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8090] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8098] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8111] device (lo): Activation: starting connection 'lo' (6a1055b1-2674-4e8e-9fff-1fce9dcc1052)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8121] device (eth0): carrier: link connected
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8128] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8135] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8136] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8146] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8155] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8163] device (eth1): carrier: link connected
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8170] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8178] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1) (indicated)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8178] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8187] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8198] device (eth1): Activation: starting connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8206] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 04:04:06 np0005593294 systemd[1]: Started Network Manager.
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8213] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8221] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8224] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8228] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8233] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8237] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8242] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8247] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8260] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8266] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8277] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8281] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8312] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8320] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8327] device (lo): Activation: successful, device activated.
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8338] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8348] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 04:04:06 np0005593294 systemd[1]: Starting Network Manager Wait Online...
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8438] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8471] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8474] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8481] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8488] device (eth0): Activation: successful, device activated.
Jan 23 04:04:06 np0005593294 NetworkManager[7216]: <info>  [1769159046.8499] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 04:04:07 np0005593294 python3[7288]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4543-3693-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:04:16 np0005593294 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:04:36 np0005593294 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5256] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:04:52 np0005593294 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:04:52 np0005593294 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5587] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5591] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5600] device (eth1): Activation: successful, device activated.
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5608] manager: startup complete
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5610] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <warn>  [1769159092.5619] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5627] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 systemd[1]: Finished Network Manager Wait Online.
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5805] dhcp4 (eth1): canceled DHCP transaction
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5806] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5806] dhcp4 (eth1): state changed no lease
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5822] policy: auto-activating connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5826] device (eth1): Activation: starting connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5827] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5831] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5838] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5849] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5893] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5896] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:04:52 np0005593294 NetworkManager[7216]: <info>  [1769159092.5903] device (eth1): Activation: successful, device activated.
Jan 23 04:05:02 np0005593294 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:05:07 np0005593294 systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 04:05:07 np0005593294 systemd[1]: session-3.scope: Consumed 1.462s CPU time.
Jan 23 04:05:07 np0005593294 systemd-logind[807]: Session 3 logged out. Waiting for processes to exit.
Jan 23 04:05:07 np0005593294 systemd-logind[807]: Removed session 3.
Jan 23 04:05:53 np0005593294 systemd-logind[807]: New session 4 of user zuul.
Jan 23 04:05:53 np0005593294 systemd[1]: Started Session 4 of User zuul.
Jan 23 04:05:53 np0005593294 python3[7397]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:05:54 np0005593294 python3[7470]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159153.5164309-373-212687453147454/source _original_basename=tmp62uv7bel follow=False checksum=6e1e8970cf6ad2f0b1a32d462d71e8a0528ec2d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:05:57 np0005593294 systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 04:05:57 np0005593294 systemd-logind[807]: Session 4 logged out. Waiting for processes to exit.
Jan 23 04:05:57 np0005593294 systemd-logind[807]: Removed session 4.
Jan 23 04:06:57 np0005593294 systemd[4325]: Created slice User Background Tasks Slice.
Jan 23 04:06:57 np0005593294 systemd[4325]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 04:06:57 np0005593294 systemd[4325]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 04:14:49 np0005593294 systemd-logind[807]: New session 5 of user zuul.
Jan 23 04:14:49 np0005593294 systemd[1]: Started Session 5 of User zuul.
Jan 23 04:14:50 np0005593294 python3[7531]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-00000000217f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:50 np0005593294 python3[7560]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:50 np0005593294 python3[7586]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:51 np0005593294 python3[7612]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:51 np0005593294 python3[7638]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:52 np0005593294 python3[7664]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:52 np0005593294 python3[7742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:14:53 np0005593294 python3[7815]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159692.4885798-545-101184356020038/source _original_basename=tmp9yq7mp4q follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:54 np0005593294 python3[7865]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:14:54 np0005593294 systemd[1]: Reloading.
Jan 23 04:14:54 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:56 np0005593294 python3[7921]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 04:14:56 np0005593294 python3[7947]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:57 np0005593294 python3[7975]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:57 np0005593294 python3[8003]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:57 np0005593294 python3[8031]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:58 np0005593294 python3[8058]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-000000002186-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:59 np0005593294 python3[8088]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 04:15:03 np0005593294 systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 04:15:03 np0005593294 systemd[1]: session-5.scope: Consumed 3.835s CPU time.
Jan 23 04:15:03 np0005593294 systemd-logind[807]: Session 5 logged out. Waiting for processes to exit.
Jan 23 04:15:03 np0005593294 systemd-logind[807]: Removed session 5.
Jan 23 04:15:05 np0005593294 systemd-logind[807]: New session 6 of user zuul.
Jan 23 04:15:05 np0005593294 systemd[1]: Started Session 6 of User zuul.
Jan 23 04:15:05 np0005593294 python3[8124]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 04:15:13 np0005593294 setsebool[8167]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 04:15:13 np0005593294 setsebool[8167]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 04:15:13 np0005593294 irqbalance[791]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 23 04:15:13 np0005593294 irqbalance[791]: IRQ 27 affinity is now unmanaged
Jan 23 04:15:29 np0005593294 kernel: SELinux:  Converting 386 SID table entries...
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:15:29 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:15:42 np0005593294 kernel: SELinux:  Converting 389 SID table entries...
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:15:42 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:15:57 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 04:15:57 np0005593294 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 04:15:57 np0005593294 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 04:15:57 np0005593294 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 04:15:57 np0005593294 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 04:16:02 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:16:02 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:16:02 np0005593294 systemd[1]: Reloading.
Jan 23 04:16:02 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:16:02 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:16:15 np0005593294 python3[16348]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f136-f057-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:16:18 np0005593294 kernel: evm: overlay not supported
Jan 23 04:16:20 np0005593294 systemd[4325]: Starting D-Bus User Message Bus...
Jan 23 04:16:20 np0005593294 dbus-broker-launch[17548]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 04:16:20 np0005593294 dbus-broker-launch[17548]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 04:16:20 np0005593294 systemd[4325]: Started D-Bus User Message Bus.
Jan 23 04:16:20 np0005593294 dbus-broker-lau[17548]: Ready
Jan 23 04:16:20 np0005593294 systemd[4325]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 04:16:20 np0005593294 systemd[4325]: Created slice Slice /user.
Jan 23 04:16:20 np0005593294 systemd[4325]: podman-16860.scope: unit configures an IP firewall, but not running as root.
Jan 23 04:16:20 np0005593294 systemd[4325]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 04:16:20 np0005593294 systemd[4325]: Started podman-16860.scope.
Jan 23 04:16:20 np0005593294 systemd[4325]: Started podman-pause-3e59ae14.scope.
Jan 23 04:16:20 np0005593294 systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 04:16:20 np0005593294 systemd[1]: session-6.scope: Consumed 49.240s CPU time.
Jan 23 04:16:20 np0005593294 systemd-logind[807]: Session 6 logged out. Waiting for processes to exit.
Jan 23 04:16:20 np0005593294 systemd-logind[807]: Removed session 6.
Jan 23 04:16:41 np0005593294 systemd-logind[807]: New session 7 of user zuul.
Jan 23 04:16:41 np0005593294 systemd[1]: Started Session 7 of User zuul.
Jan 23 04:16:42 np0005593294 python3[26035]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:16:42 np0005593294 python3[26249]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:16:43 np0005593294 python3[26612]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593294.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 04:16:44 np0005593294 python3[27156]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:16:45 np0005593294 python3[27274]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:16:45 np0005593294 python3[27446]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159804.7710958-151-62106867819167/source _original_basename=tmp1c5_jq48 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:16:46 np0005593294 python3[27766]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 23 04:16:46 np0005593294 systemd[1]: Starting Hostname Service...
Jan 23 04:16:46 np0005593294 systemd[1]: Started Hostname Service.
Jan 23 04:16:46 np0005593294 systemd-hostnamed[27875]: Changed pretty hostname to 'compute-1'
Jan 23 04:16:46 np0005593294 systemd-hostnamed[27875]: Hostname set to <compute-1> (static)
Jan 23 04:16:46 np0005593294 NetworkManager[7216]: <info>  [1769159806.6029] hostname: static hostname changed from "np0005593294.novalocal" to "compute-1"
Jan 23 04:16:46 np0005593294 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:16:46 np0005593294 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:16:47 np0005593294 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 04:16:47 np0005593294 systemd[1]: session-7.scope: Consumed 2.476s CPU time.
Jan 23 04:16:47 np0005593294 systemd-logind[807]: Session 7 logged out. Waiting for processes to exit.
Jan 23 04:16:47 np0005593294 systemd-logind[807]: Removed session 7.
Jan 23 04:16:56 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:16:56 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:16:56 np0005593294 systemd[1]: man-db-cache-update.service: Consumed 53.471s CPU time.
Jan 23 04:16:56 np0005593294 systemd[1]: run-ra6e1f36ccf3f47abb7de2cd3ca88c949.service: Deactivated successfully.
Jan 23 04:16:56 np0005593294 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:17:16 np0005593294 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:21:46 np0005593294 systemd-logind[807]: New session 8 of user zuul.
Jan 23 04:21:46 np0005593294 systemd[1]: Started Session 8 of User zuul.
Jan 23 04:21:47 np0005593294 python3[30018]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:21:48 np0005593294 python3[30134]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:49 np0005593294 python3[30207]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:49 np0005593294 python3[30233]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:50 np0005593294 python3[30306]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:50 np0005593294 python3[30332]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:50 np0005593294 python3[30405]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:50 np0005593294 python3[30431]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:51 np0005593294 python3[30504]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:51 np0005593294 python3[30530]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:51 np0005593294 python3[30603]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:52 np0005593294 python3[30629]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:52 np0005593294 python3[30702]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:52 np0005593294 python3[30728]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:53 np0005593294 python3[30801]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:05 np0005593294 python3[30850]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:27:05 np0005593294 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 04:27:05 np0005593294 systemd[1]: session-8.scope: Consumed 4.809s CPU time.
Jan 23 04:27:05 np0005593294 systemd-logind[807]: Session 8 logged out. Waiting for processes to exit.
Jan 23 04:27:05 np0005593294 systemd-logind[807]: Removed session 8.
Jan 23 04:36:57 np0005593294 systemd[1]: Starting dnf makecache...
Jan 23 04:36:57 np0005593294 dnf[30871]: Failed determining last makecache time.
Jan 23 04:36:57 np0005593294 dnf[30871]: delorean-openstack-barbican-42b4c41831408a8e323 375 kB/s |  13 kB     00:00
Jan 23 04:36:57 np0005593294 dnf[30871]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.4 MB/s |  65 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.3 MB/s |  32 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-python-stevedore-c4acc5639fd2329372142 4.7 MB/s | 131 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.2 MB/s |  32 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-os-refresh-config-9bfc52b5049be2d8de61  11 MB/s | 349 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 386 kB/s |  42 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-python-designate-tests-tempest-347fdbc 551 kB/s |  18 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-glance-1fd12c29b339f30fe823e 517 kB/s |  18 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.1 MB/s |  29 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-manila-3c01b7181572c95dac462 1.0 MB/s |  25 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-python-whitebox-neutron-tests-tempest- 5.0 MB/s | 154 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-octavia-ba397f07a7331190208c 975 kB/s |  26 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-watcher-c014f81a8647287f6dcc 641 kB/s |  16 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-ansible-config_template-5ccaa22121a7ff 310 kB/s | 7.4 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 4.3 MB/s | 144 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-swift-dc98a8463506ac520c469a 541 kB/s |  14 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-python-tempestconf-8515371b7cceebd4282 2.2 MB/s |  53 kB     00:00
Jan 23 04:36:58 np0005593294 dnf[30871]: delorean-openstack-heat-ui-013accbfd179753bc3f0 2.8 MB/s |  96 kB     00:00
Jan 23 04:36:59 np0005593294 dnf[30871]: CentOS Stream 9 - BaseOS                         66 kB/s | 6.7 kB     00:00
Jan 23 04:36:59 np0005593294 dnf[30871]: CentOS Stream 9 - AppStream                      67 kB/s | 6.8 kB     00:00
Jan 23 04:36:59 np0005593294 dnf[30871]: CentOS Stream 9 - CRB                            55 kB/s | 6.6 kB     00:00
Jan 23 04:36:59 np0005593294 dnf[30871]: CentOS Stream 9 - Extras packages                69 kB/s | 7.3 kB     00:00
Jan 23 04:36:59 np0005593294 dnf[30871]: dlrn-antelope-testing                            26 MB/s | 1.1 MB     00:00
Jan 23 04:37:00 np0005593294 dnf[30871]: dlrn-antelope-build-deps                         10 MB/s | 461 kB     00:00
Jan 23 04:37:00 np0005593294 dnf[30871]: centos9-rabbitmq                                7.5 MB/s | 123 kB     00:00
Jan 23 04:37:00 np0005593294 dnf[30871]: centos9-storage                                  20 MB/s | 415 kB     00:00
Jan 23 04:37:00 np0005593294 dnf[30871]: centos9-opstools                                3.7 MB/s |  51 kB     00:00
Jan 23 04:37:00 np0005593294 dnf[30871]: NFV SIG OpenvSwitch                              19 MB/s | 461 kB     00:00
Jan 23 04:37:01 np0005593294 dnf[30871]: repo-setup-centos-appstream                      89 MB/s |  26 MB     00:00
Jan 23 04:37:07 np0005593294 dnf[30871]: repo-setup-centos-baseos                         62 MB/s | 8.9 MB     00:00
Jan 23 04:37:09 np0005593294 dnf[30871]: repo-setup-centos-highavailability               29 MB/s | 744 kB     00:00
Jan 23 04:37:09 np0005593294 dnf[30871]: repo-setup-centos-powertools                     64 MB/s | 7.6 MB     00:00
Jan 23 04:37:12 np0005593294 dnf[30871]: Extra Packages for Enterprise Linux 9 - x86_64   18 MB/s |  20 MB     00:01
Jan 23 04:37:13 np0005593294 systemd-logind[807]: New session 9 of user zuul.
Jan 23 04:37:14 np0005593294 systemd[1]: Started Session 9 of User zuul.
Jan 23 04:37:15 np0005593294 python3.9[31127]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:37:16 np0005593294 python3.9[31308]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:37:26 np0005593294 dnf[30871]: Metadata cache created.
Jan 23 04:37:26 np0005593294 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 04:37:26 np0005593294 systemd[1]: Finished dnf makecache.
Jan 23 04:37:26 np0005593294 systemd[1]: dnf-makecache.service: Consumed 26.725s CPU time.
Jan 23 04:37:29 np0005593294 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 04:37:29 np0005593294 systemd[1]: session-9.scope: Consumed 8.179s CPU time.
Jan 23 04:37:29 np0005593294 systemd-logind[807]: Session 9 logged out. Waiting for processes to exit.
Jan 23 04:37:29 np0005593294 systemd-logind[807]: Removed session 9.
Jan 23 04:37:46 np0005593294 systemd-logind[807]: New session 10 of user zuul.
Jan 23 04:37:46 np0005593294 systemd[1]: Started Session 10 of User zuul.
Jan 23 04:37:47 np0005593294 python3.9[31521]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 04:37:49 np0005593294 python3.9[31695]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:37:49 np0005593294 python3.9[31847]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:37:51 np0005593294 python3.9[32000]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:37:52 np0005593294 python3.9[32152]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:37:52 np0005593294 python3.9[32304]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:37:53 np0005593294 python3.9[32427]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161072.4290197-173-99330198734100/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:37:54 np0005593294 python3.9[32579]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:37:55 np0005593294 python3.9[32735]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:37:56 np0005593294 python3.9[32887]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:37:57 np0005593294 python3.9[33037]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:38:00 np0005593294 python3.9[33290]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:38:01 np0005593294 python3.9[33440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:38:03 np0005593294 python3.9[33594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:38:04 np0005593294 python3.9[33752]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:38:05 np0005593294 python3.9[33836]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:38:38 np0005593294 systemd[1]: Reloading.
Jan 23 04:38:38 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:38:38 np0005593294 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 04:38:38 np0005593294 systemd[1]: Reloading.
Jan 23 04:38:38 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:38:39 np0005593294 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 04:38:39 np0005593294 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 04:38:39 np0005593294 systemd[1]: Reloading.
Jan 23 04:38:39 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:38:39 np0005593294 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 04:38:39 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:38:39 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:38:39 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:39:48 np0005593294 kernel: SELinux:  Converting 2724 SID table entries...
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:39:48 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:39:49 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 04:39:49 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:39:49 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:39:49 np0005593294 systemd[1]: Reloading.
Jan 23 04:39:49 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:39:49 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:39:51 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:39:51 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:39:51 np0005593294 systemd[1]: man-db-cache-update.service: Consumed 1.462s CPU time.
Jan 23 04:39:51 np0005593294 systemd[1]: run-r9f3775315b3e4c6b987812721a80dfba.service: Deactivated successfully.
Jan 23 04:40:32 np0005593294 python3.9[35347]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:40:34 np0005593294 python3.9[35628]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 04:40:35 np0005593294 python3.9[35780]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 04:40:40 np0005593294 python3.9[35933]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:40:41 np0005593294 python3.9[36085]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 04:40:45 np0005593294 python3.9[36237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:40:49 np0005593294 python3.9[36389]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:40:50 np0005593294 python3.9[36512]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161248.7896843-662-194883927395366/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:40:56 np0005593294 python3.9[36664]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:40:57 np0005593294 python3.9[36816]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:40:57 np0005593294 python3.9[36969]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:40:59 np0005593294 python3.9[37121]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 04:41:00 np0005593294 python3.9[37274]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:41:00 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:41:01 np0005593294 python3.9[37433]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:41:02 np0005593294 python3.9[37593]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 04:41:03 np0005593294 python3.9[37746]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:41:04 np0005593294 python3.9[37904]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 04:41:06 np0005593294 python3.9[38056]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:41:11 np0005593294 python3.9[38209]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:41:12 np0005593294 python3.9[38361]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:41:12 np0005593294 python3.9[38484]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161271.7468174-1019-30385254113136/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:41:13 np0005593294 irqbalance[791]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 23 04:41:13 np0005593294 irqbalance[791]: IRQ 26 affinity is now unmanaged
Jan 23 04:41:13 np0005593294 python3.9[38636]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:41:13 np0005593294 systemd[1]: Starting Load Kernel Modules...
Jan 23 04:41:13 np0005593294 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 04:41:13 np0005593294 kernel: Bridge firewalling registered
Jan 23 04:41:13 np0005593294 systemd-modules-load[38640]: Inserted module 'br_netfilter'
Jan 23 04:41:13 np0005593294 systemd[1]: Finished Load Kernel Modules.
Jan 23 04:41:14 np0005593294 python3.9[38796]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:41:15 np0005593294 python3.9[38919]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161274.202263-1088-42987376531943/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:41:16 np0005593294 python3.9[39071]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:41:19 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:41:19 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:41:20 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:41:20 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:41:20 np0005593294 systemd[1]: Reloading.
Jan 23 04:41:20 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:41:20 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:41:23 np0005593294 python3.9[41978]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:41:23 np0005593294 python3.9[42770]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 04:41:24 np0005593294 python3.9[43088]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:41:25 np0005593294 python3.9[43240]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:25 np0005593294 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:41:25 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:41:25 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:41:25 np0005593294 systemd[1]: man-db-cache-update.service: Consumed 5.047s CPU time.
Jan 23 04:41:25 np0005593294 systemd[1]: run-rbafa2d86e4ae41b2ab845ad0ab743bb3.service: Deactivated successfully.
Jan 23 04:41:26 np0005593294 systemd[1]: Starting Authorization Manager...
Jan 23 04:41:26 np0005593294 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:41:26 np0005593294 polkitd[43458]: Started polkitd version 0.117
Jan 23 04:41:26 np0005593294 systemd[1]: Started Authorization Manager.
Jan 23 04:41:27 np0005593294 python3.9[43628]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:41:27 np0005593294 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 04:41:27 np0005593294 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 04:41:27 np0005593294 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 04:41:27 np0005593294 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:41:28 np0005593294 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:41:29 np0005593294 python3.9[43790]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 04:41:32 np0005593294 python3.9[43942]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:41:32 np0005593294 systemd[1]: Reloading.
Jan 23 04:41:32 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:41:33 np0005593294 python3.9[44132]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:41:33 np0005593294 systemd[1]: Reloading.
Jan 23 04:41:33 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:41:34 np0005593294 python3.9[44321]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:35 np0005593294 python3.9[44474]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:35 np0005593294 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 04:41:36 np0005593294 python3.9[44627]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:38 np0005593294 python3.9[44789]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:39 np0005593294 python3.9[44942]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:41:39 np0005593294 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 04:41:39 np0005593294 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 04:41:39 np0005593294 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 04:41:39 np0005593294 systemd[1]: Starting Apply Kernel Variables...
Jan 23 04:41:39 np0005593294 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 04:41:39 np0005593294 systemd[1]: Finished Apply Kernel Variables.
Jan 23 04:41:40 np0005593294 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 04:41:40 np0005593294 systemd[1]: session-10.scope: Consumed 2min 8.519s CPU time.
Jan 23 04:41:40 np0005593294 systemd-logind[807]: Session 10 logged out. Waiting for processes to exit.
Jan 23 04:41:40 np0005593294 systemd-logind[807]: Removed session 10.
Jan 23 04:41:47 np0005593294 systemd-logind[807]: New session 11 of user zuul.
Jan 23 04:41:47 np0005593294 systemd[1]: Started Session 11 of User zuul.
Jan 23 04:41:48 np0005593294 python3.9[45127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:41:50 np0005593294 python3.9[45283]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 04:41:51 np0005593294 python3.9[45436]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:41:52 np0005593294 python3.9[45594]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:41:56 np0005593294 python3.9[45754]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:41:56 np0005593294 python3.9[45838]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:42:00 np0005593294 python3.9[46001]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:12 np0005593294 kernel: SELinux:  Converting 2736 SID table entries...
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:42:12 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:42:13 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 04:42:13 np0005593294 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 04:42:15 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:42:15 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:42:15 np0005593294 systemd[1]: Reloading.
Jan 23 04:42:15 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:15 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:15 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:42:16 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:42:16 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:42:16 np0005593294 systemd[1]: run-r3da8bb381b17428c89ac1ac1d3e19566.service: Deactivated successfully.
Jan 23 04:42:24 np0005593294 python3.9[47099]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:42:24 np0005593294 systemd[1]: Reloading.
Jan 23 04:42:25 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:25 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:25 np0005593294 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 04:42:25 np0005593294 chown[47141]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 04:42:25 np0005593294 ovs-ctl[47146]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 04:42:25 np0005593294 ovs-ctl[47146]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 04:42:25 np0005593294 ovs-ctl[47146]: Starting ovsdb-server [  OK  ]
Jan 23 04:42:25 np0005593294 ovs-vsctl[47195]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 04:42:25 np0005593294 ovs-vsctl[47211]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"170ec811-bf2b-4b3a-9339-50a49c79a1e6\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 04:42:25 np0005593294 ovs-ctl[47146]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 04:42:25 np0005593294 ovs-ctl[47146]: Enabling remote OVSDB managers [  OK  ]
Jan 23 04:42:25 np0005593294 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 04:42:25 np0005593294 ovs-vsctl[47221]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 23 04:42:25 np0005593294 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 04:42:25 np0005593294 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 04:42:25 np0005593294 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 04:42:25 np0005593294 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 04:42:25 np0005593294 ovs-ctl[47266]: Inserting openvswitch module [  OK  ]
Jan 23 04:42:25 np0005593294 ovs-ctl[47235]: Starting ovs-vswitchd [  OK  ]
Jan 23 04:42:25 np0005593294 ovs-ctl[47235]: Enabling remote OVSDB managers [  OK  ]
Jan 23 04:42:25 np0005593294 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 04:42:25 np0005593294 ovs-vsctl[47283]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 23 04:42:25 np0005593294 systemd[1]: Starting Open vSwitch...
Jan 23 04:42:25 np0005593294 systemd[1]: Finished Open vSwitch.
Jan 23 04:42:27 np0005593294 python3.9[47435]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:42:28 np0005593294 python3.9[47587]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 04:42:30 np0005593294 kernel: SELinux:  Converting 2750 SID table entries...
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:42:30 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:42:31 np0005593294 python3.9[47742]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:42:32 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 04:42:32 np0005593294 python3.9[47900]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:35 np0005593294 python3.9[48053]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:42:37 np0005593294 python3.9[48340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 04:42:38 np0005593294 python3.9[48490]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:42:38 np0005593294 python3.9[48644]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:40 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:42:40 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:42:40 np0005593294 systemd[1]: Reloading.
Jan 23 04:42:40 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:40 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:41 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:42:41 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:42:41 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:42:41 np0005593294 systemd[1]: run-r2070ae03a98d45fbb5dd949472118885.service: Deactivated successfully.
Jan 23 04:42:45 np0005593294 python3.9[48960]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:42:45 np0005593294 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 04:42:45 np0005593294 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 04:42:45 np0005593294 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 04:42:45 np0005593294 systemd[1]: Stopping Network Manager...
Jan 23 04:42:45 np0005593294 NetworkManager[7216]: <info>  [1769161365.1029] caught SIGTERM, shutting down normally.
Jan 23 04:42:45 np0005593294 NetworkManager[7216]: <info>  [1769161365.1046] dhcp4 (eth0): canceled DHCP transaction
Jan 23 04:42:45 np0005593294 NetworkManager[7216]: <info>  [1769161365.1046] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:42:45 np0005593294 NetworkManager[7216]: <info>  [1769161365.1046] dhcp4 (eth0): state changed no lease
Jan 23 04:42:45 np0005593294 NetworkManager[7216]: <info>  [1769161365.1048] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:42:45 np0005593294 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:42:45 np0005593294 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:42:45 np0005593294 NetworkManager[7216]: <info>  [1769161365.2011] exiting (success)
Jan 23 04:42:45 np0005593294 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 04:42:45 np0005593294 systemd[1]: Stopped Network Manager.
Jan 23 04:42:45 np0005593294 systemd[1]: NetworkManager.service: Consumed 14.428s CPU time, 4.1M memory peak, read 0B from disk, written 17.0K to disk.
Jan 23 04:42:45 np0005593294 systemd[1]: Starting Network Manager...
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.2738] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0ec3f185-e60c-43ea-a74e-c21caf2508ae)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.2739] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.2806] manager[0x555d442be000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 04:42:45 np0005593294 systemd[1]: Starting Hostname Service...
Jan 23 04:42:45 np0005593294 systemd[1]: Started Hostname Service.
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3653] hostname: hostname: using hostnamed
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3655] hostname: static hostname changed from (none) to "compute-1"
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3659] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3662] manager[0x555d442be000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3663] manager[0x555d442be000]: rfkill: WWAN hardware radio set enabled
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3682] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3691] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3692] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3692] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3693] manager: Networking is enabled by state file
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3695] settings: Loaded settings plugin: keyfile (internal)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3699] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3726] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3733] dhcp: init: Using DHCP client 'internal'
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3736] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3741] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3745] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3752] device (lo): Activation: starting connection 'lo' (6a1055b1-2674-4e8e-9fff-1fce9dcc1052)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3758] device (eth0): carrier: link connected
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3761] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3767] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3768] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3774] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3780] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3786] device (eth1): carrier: link connected
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3790] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3795] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50) (indicated)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3796] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3801] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3807] device (eth1): Activation: starting connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3812] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 04:42:45 np0005593294 systemd[1]: Started Network Manager.
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3819] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3822] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3827] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3830] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3833] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3836] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3838] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3851] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3861] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3865] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3890] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3903] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3905] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3910] device (lo): Activation: successful, device activated.
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3918] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3922] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3927] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3931] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3933] device (eth1): Activation: successful, device activated.
Jan 23 04:42:45 np0005593294 systemd[1]: Starting Network Manager Wait Online...
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.3951] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5601] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5638] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5640] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5644] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5646] device (eth0): Activation: successful, device activated.
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5651] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 04:42:45 np0005593294 NetworkManager[48978]: <info>  [1769161365.5653] manager: startup complete
Jan 23 04:42:45 np0005593294 systemd[1]: Finished Network Manager Wait Online.
Jan 23 04:42:46 np0005593294 python3.9[49186]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:53 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:42:53 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:42:53 np0005593294 systemd[1]: Reloading.
Jan 23 04:42:53 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:53 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:53 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:42:55 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:42:55 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:42:55 np0005593294 systemd[1]: run-re0ba4bfdb04f412b91ccd3f0960499de.service: Deactivated successfully.
Jan 23 04:42:55 np0005593294 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:43:00 np0005593294 python3.9[49645]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:43:01 np0005593294 python3.9[49797]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:02 np0005593294 python3.9[49951]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:02 np0005593294 python3.9[50103]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:03 np0005593294 python3.9[50255]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:04 np0005593294 python3.9[50407]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:04 np0005593294 python3.9[50559]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:05 np0005593294 python3.9[50682]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161384.2408855-643-241763987821859/.source _original_basename=.yiyg6a5s follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:06 np0005593294 python3.9[50834]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:07 np0005593294 python3.9[50986]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 04:43:07 np0005593294 python3.9[51138]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:10 np0005593294 python3.9[51565]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 04:43:11 np0005593294 ansible-async_wrapper.py[51740]: Invoked with j728794609016 300 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.6718836-841-8122361076958/AnsiballZ_edpm_os_net_config.py _
Jan 23 04:43:11 np0005593294 ansible-async_wrapper.py[51743]: Starting module and watcher
Jan 23 04:43:11 np0005593294 ansible-async_wrapper.py[51743]: Start watching 51744 (300)
Jan 23 04:43:11 np0005593294 ansible-async_wrapper.py[51744]: Start module (51744)
Jan 23 04:43:11 np0005593294 ansible-async_wrapper.py[51740]: Return async_wrapper task started.
Jan 23 04:43:11 np0005593294 python3.9[51745]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 04:43:12 np0005593294 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 04:43:12 np0005593294 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 04:43:12 np0005593294 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 04:43:12 np0005593294 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 04:43:12 np0005593294 kernel: cfg80211: failed to load regulatory.db
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7285] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7304] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7853] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7855] audit: op="connection-add" uuid="fcb52b2f-b59b-4641-8ef0-a8a3fe18cf9d" name="br-ex-br" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7869] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7870] audit: op="connection-add" uuid="b85b0313-6538-4d60-ae77-e76f4e59afd5" name="br-ex-port" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7880] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7881] audit: op="connection-add" uuid="83ff4c95-1394-48ce-bd5f-a5049c430383" name="eth1-port" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7891] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7892] audit: op="connection-add" uuid="c34547fa-5413-4521-a86b-c27c1e22e373" name="vlan20-port" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7903] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7904] audit: op="connection-add" uuid="fde3432a-6a8b-4ca7-a27a-9806a9829092" name="vlan21-port" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7914] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7915] audit: op="connection-add" uuid="8178505a-1b68-430c-82b2-c078deaaa866" name="vlan22-port" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7925] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7926] audit: op="connection-add" uuid="b7d2bc4e-ab13-4715-bbc4-69fde53fe582" name="vlan23-port" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7943] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7957] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.7958] audit: op="connection-add" uuid="15ca7997-d48c-49d0-811d-2a7146518225" name="br-ex-if" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8003] audit: op="connection-update" uuid="f0d37197-be61-575f-8210-b0dbd6f4eb50" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv4.dns,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv4.method,ipv4.never-default,connection.port-type,connection.master,connection.controller,connection.slave-type,connection.timestamp,ipv6.dns,ipv6.addresses,ipv6.routes,ipv6.routing-rules,ipv6.method,ipv6.addr-gen-mode" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8025] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8027] audit: op="connection-add" uuid="441fef83-1900-4302-8a85-ab4614af5f62" name="vlan20-if" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8046] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8047] audit: op="connection-add" uuid="7aa698c4-7f45-4cde-9381-28f8d723332a" name="vlan21-if" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8066] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8067] audit: op="connection-add" uuid="21ce2276-23b7-471d-a22a-02b98a19bbe0" name="vlan22-if" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8087] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8088] audit: op="connection-add" uuid="3a9b6f96-dcfa-44cc-acda-75f5e6b470a6" name="vlan23-if" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8101] audit: op="connection-delete" uuid="c9ce933b-996c-3254-bccc-8d3373d274f1" name="Wired connection 1" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8113] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8116] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8124] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8130] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (fcb52b2f-b59b-4641-8ef0-a8a3fe18cf9d)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8131] audit: op="connection-activate" uuid="fcb52b2f-b59b-4641-8ef0-a8a3fe18cf9d" name="br-ex-br" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8133] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8134] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8140] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8144] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b85b0313-6538-4d60-ae77-e76f4e59afd5)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8146] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8147] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8151] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8155] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (83ff4c95-1394-48ce-bd5f-a5049c430383)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8157] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8158] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8164] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8169] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (c34547fa-5413-4521-a86b-c27c1e22e373)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8171] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8172] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8178] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8182] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (fde3432a-6a8b-4ca7-a27a-9806a9829092)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8184] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8185] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8192] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8197] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (8178505a-1b68-430c-82b2-c078deaaa866)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8199] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8200] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8206] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8210] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (b7d2bc4e-ab13-4715-bbc4-69fde53fe582)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8211] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8213] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8216] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8224] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8225] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8228] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8232] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (15ca7997-d48c-49d0-811d-2a7146518225)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8233] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8236] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8238] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8239] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8240] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8251] device (eth1): disconnecting for new activation request.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8251] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8254] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8256] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8257] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8260] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8261] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8264] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8268] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (441fef83-1900-4302-8a85-ab4614af5f62)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8269] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8271] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8273] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8274] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8277] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8278] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8281] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8285] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (7aa698c4-7f45-4cde-9381-28f8d723332a)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8286] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8290] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8291] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8292] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8296] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8297] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8300] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8306] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (21ce2276-23b7-471d-a22a-02b98a19bbe0)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8307] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8310] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8312] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8313] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8317] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <warn>  [1769161393.8318] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8324] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8334] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (3a9b6f96-dcfa-44cc-acda-75f5e6b470a6)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8336] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8341] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8343] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8345] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8347] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8363] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8366] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8369] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8371] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8384] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8389] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8392] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8394] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8396] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8400] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 kernel: ovs-system: entered promiscuous mode
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8405] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8407] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8409] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8414] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8418] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8421] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8423] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8427] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8432] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 kernel: Timeout policy base is empty
Jan 23 04:43:13 np0005593294 systemd-udevd[51749]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8434] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8439] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8443] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8448] dhcp4 (eth0): canceled DHCP transaction
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8448] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8448] dhcp4 (eth0): state changed no lease
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8450] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8461] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8470] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51746 uid=0 result="fail" reason="Device is not activated"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8474] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8482] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8491] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8499] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8501] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 04:43:13 np0005593294 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8562] device (eth1): disconnecting for new activation request.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8563] audit: op="connection-activate" uuid="f0d37197-be61-575f-8210-b0dbd6f4eb50" name="ci-private-network" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8677] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8678] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8811] device (eth1): Activation: starting connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8815] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8816] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8817] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8818] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8818] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8819] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8820] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8825] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8827] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8832] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8836] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8839] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8841] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 kernel: br-ex: entered promiscuous mode
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8844] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8847] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8850] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8853] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8856] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8858] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8861] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8864] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8867] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8870] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8882] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8899] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8902] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8940] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 kernel: vlan22: entered promiscuous mode
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8946] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.8950] device (eth1): Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 systemd-udevd[51751]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9005] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9020] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 kernel: vlan21: entered promiscuous mode
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9047] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9049] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9052] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9092] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9102] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9132] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9134] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9137] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 kernel: vlan23: entered promiscuous mode
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9174] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9188] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9217] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9218] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9221] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 kernel: vlan20: entered promiscuous mode
Jan 23 04:43:13 np0005593294 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9281] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9292] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9317] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9318] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9323] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9366] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9379] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9404] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9406] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593294 NetworkManager[48978]: <info>  [1769161393.9410] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.0511] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.2161] checkpoint[0x555d44294950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.2167] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 04:43:15 np0005593294 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:43:15 np0005593294 python3.9[52103]: ansible-ansible.legacy.async_status Invoked with jid=j728794609016.51740 mode=status _async_dir=/root/.ansible_async
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.5173] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.5185] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.7595] audit: op="networking-control" arg="global-dns-configuration" pid=51746 uid=0 result="success"
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.7629] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.7656] audit: op="networking-control" arg="global-dns-configuration" pid=51746 uid=0 result="success"
Jan 23 04:43:15 np0005593294 NetworkManager[48978]: <info>  [1769161395.8095] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 04:43:16 np0005593294 NetworkManager[48978]: <info>  [1769161395.9999] checkpoint[0x555d44294a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 04:43:16 np0005593294 NetworkManager[48978]: <info>  [1769161396.0003] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 04:43:16 np0005593294 ansible-async_wrapper.py[51744]: Module complete (51744)
Jan 23 04:43:16 np0005593294 ansible-async_wrapper.py[51743]: Done in kid B.
Jan 23 04:43:18 np0005593294 python3.9[52212]: ansible-ansible.legacy.async_status Invoked with jid=j728794609016.51740 mode=status _async_dir=/root/.ansible_async
Jan 23 04:43:19 np0005593294 python3.9[52312]: ansible-ansible.legacy.async_status Invoked with jid=j728794609016.51740 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 04:43:20 np0005593294 python3.9[52464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:20 np0005593294 python3.9[52587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161399.784632-922-40190662143048/.source.returncode _original_basename=.r_ria67y follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:21 np0005593294 python3.9[52739]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:22 np0005593294 python3.9[52862]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161401.1368065-970-98063642981915/.source.cfg _original_basename=.j53kx5k0 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:23 np0005593294 python3.9[53015]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:43:23 np0005593294 systemd[1]: Reloading Network Manager...
Jan 23 04:43:23 np0005593294 NetworkManager[48978]: <info>  [1769161403.2076] audit: op="reload" arg="0" pid=53019 uid=0 result="success"
Jan 23 04:43:23 np0005593294 NetworkManager[48978]: <info>  [1769161403.2082] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 04:43:23 np0005593294 systemd[1]: Reloaded Network Manager.
Jan 23 04:43:23 np0005593294 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 04:43:23 np0005593294 systemd[1]: session-11.scope: Consumed 50.625s CPU time.
Jan 23 04:43:23 np0005593294 systemd-logind[807]: Session 11 logged out. Waiting for processes to exit.
Jan 23 04:43:23 np0005593294 systemd-logind[807]: Removed session 11.
Jan 23 04:43:29 np0005593294 systemd-logind[807]: New session 12 of user zuul.
Jan 23 04:43:29 np0005593294 systemd[1]: Started Session 12 of User zuul.
Jan 23 04:43:30 np0005593294 python3.9[53203]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:32 np0005593294 python3.9[53358]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:43:33 np0005593294 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:43:33 np0005593294 python3.9[53553]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:43:33 np0005593294 systemd-logind[807]: Session 12 logged out. Waiting for processes to exit.
Jan 23 04:43:33 np0005593294 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 04:43:33 np0005593294 systemd[1]: session-12.scope: Consumed 2.541s CPU time.
Jan 23 04:43:33 np0005593294 systemd-logind[807]: Removed session 12.
Jan 23 04:43:39 np0005593294 systemd-logind[807]: New session 13 of user zuul.
Jan 23 04:43:39 np0005593294 systemd[1]: Started Session 13 of User zuul.
Jan 23 04:43:40 np0005593294 python3.9[53734]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:41 np0005593294 python3.9[53888]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:42 np0005593294 python3.9[54045]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:43:43 np0005593294 python3.9[54129]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:43:46 np0005593294 python3.9[54283]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:43:47 np0005593294 python3.9[54478]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:48 np0005593294 python3.9[54630]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:43:48 np0005593294 podman[54631]: 2026-01-23 09:43:48.413618339 +0000 UTC m=+0.054328689 system refresh
Jan 23 04:43:49 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:43:49 np0005593294 python3.9[54791]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:50 np0005593294 python3.9[54914]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161428.7140532-193-247961873399552/.source.json follow=False _original_basename=podman_network_config.j2 checksum=6d3b755833236f070a036449324fcb17c483d383 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:51 np0005593294 python3.9[55066]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:51 np0005593294 python3.9[55189]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161430.539225-238-37214369718531/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:52 np0005593294 python3.9[55341]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:53 np0005593294 python3.9[55493]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:53 np0005593294 python3.9[55645]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:54 np0005593294 python3.9[55797]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:55 np0005593294 python3.9[55949]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:43:58 np0005593294 python3.9[56104]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:58 np0005593294 python3.9[56258]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:43:59 np0005593294 python3.9[56410]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:44:00 np0005593294 python3.9[56562]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:44:01 np0005593294 python3.9[56715]: ansible-service_facts Invoked
Jan 23 04:44:01 np0005593294 network[56732]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:44:01 np0005593294 network[56733]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:44:01 np0005593294 network[56734]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:44:06 np0005593294 python3.9[57186]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:44:10 np0005593294 python3.9[57339]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 04:44:12 np0005593294 python3.9[57491]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:13 np0005593294 python3.9[57616]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161451.899059-671-721447290078/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:13 np0005593294 python3.9[57770]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:14 np0005593294 python3.9[57895]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161453.2719088-716-245324001402472/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:16 np0005593294 python3.9[58049]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:18 np0005593294 python3.9[58203]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:44:19 np0005593294 python3.9[58287]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:44:20 np0005593294 python3.9[58441]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:44:21 np0005593294 python3.9[58525]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:44:21 np0005593294 systemd[1]: Stopping NTP client/server...
Jan 23 04:44:21 np0005593294 chronyd[781]: chronyd exiting
Jan 23 04:44:21 np0005593294 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 04:44:21 np0005593294 systemd[1]: Stopped NTP client/server.
Jan 23 04:44:21 np0005593294 systemd[1]: Starting NTP client/server...
Jan 23 04:44:21 np0005593294 chronyd[58533]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 04:44:21 np0005593294 chronyd[58533]: Frequency -31.645 +/- 0.148 ppm read from /var/lib/chrony/drift
Jan 23 04:44:21 np0005593294 chronyd[58533]: Loaded seccomp filter (level 2)
Jan 23 04:44:21 np0005593294 systemd[1]: Started NTP client/server.
Jan 23 04:44:22 np0005593294 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 04:44:22 np0005593294 systemd[1]: session-13.scope: Consumed 26.864s CPU time.
Jan 23 04:44:22 np0005593294 systemd-logind[807]: Session 13 logged out. Waiting for processes to exit.
Jan 23 04:44:22 np0005593294 systemd-logind[807]: Removed session 13.
Jan 23 04:44:29 np0005593294 systemd-logind[807]: New session 14 of user zuul.
Jan 23 04:44:29 np0005593294 systemd[1]: Started Session 14 of User zuul.
Jan 23 04:44:29 np0005593294 python3.9[58714]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:30 np0005593294 python3.9[58866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:31 np0005593294 python3.9[58989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161470.173351-58-281226781184061/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:31 np0005593294 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 04:44:31 np0005593294 systemd[1]: session-14.scope: Consumed 1.769s CPU time.
Jan 23 04:44:31 np0005593294 systemd-logind[807]: Session 14 logged out. Waiting for processes to exit.
Jan 23 04:44:31 np0005593294 systemd-logind[807]: Removed session 14.
Jan 23 04:44:37 np0005593294 systemd-logind[807]: New session 15 of user zuul.
Jan 23 04:44:37 np0005593294 systemd[1]: Started Session 15 of User zuul.
Jan 23 04:44:38 np0005593294 python3.9[59168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:44:39 np0005593294 python3.9[59324]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:40 np0005593294 python3.9[59499]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:41 np0005593294 python3.9[59622]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769161480.1817193-79-192264425773339/.source.json _original_basename=.hjzk4ntc follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:42 np0005593294 python3.9[59774]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:43 np0005593294 python3.9[59897]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161482.0245578-148-162811431184813/.source _original_basename=.xnr2vyk_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:43 np0005593294 python3.9[60049]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:44:44 np0005593294 python3.9[60201]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:45 np0005593294 python3.9[60324]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161484.1323788-220-46440115688947/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:44:45 np0005593294 python3.9[60476]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:46 np0005593294 python3.9[60599]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161485.273861-220-17542410538509/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:44:47 np0005593294 python3.9[60751]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:48 np0005593294 python3.9[60903]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:48 np0005593294 python3.9[61026]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161487.7552755-331-49845857292096/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:49 np0005593294 python3.9[61179]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:49 np0005593294 python3.9[61302]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161488.9764304-376-141887458440941/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:51 np0005593294 python3.9[61454]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:44:51 np0005593294 systemd[1]: Reloading.
Jan 23 04:44:51 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:51 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:51 np0005593294 systemd[1]: Reloading.
Jan 23 04:44:51 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:51 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:51 np0005593294 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 04:44:51 np0005593294 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 04:44:52 np0005593294 python3.9[61683]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:53 np0005593294 python3.9[61806]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161491.9637334-445-15161176188205/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:53 np0005593294 python3.9[61958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:54 np0005593294 python3.9[62081]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161493.2674856-490-135175618482859/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:55 np0005593294 python3.9[62233]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:44:55 np0005593294 systemd[1]: Reloading.
Jan 23 04:44:55 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:55 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:55 np0005593294 systemd[1]: Reloading.
Jan 23 04:44:55 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:55 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:55 np0005593294 systemd[1]: Starting Create netns directory...
Jan 23 04:44:55 np0005593294 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:44:55 np0005593294 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:44:55 np0005593294 systemd[1]: Finished Create netns directory.
Jan 23 04:44:56 np0005593294 python3.9[62458]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:44:56 np0005593294 network[62475]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:44:56 np0005593294 network[62476]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:44:56 np0005593294 network[62477]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:45:01 np0005593294 python3.9[62740]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:45:01 np0005593294 systemd[1]: Reloading.
Jan 23 04:45:01 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:45:01 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:45:01 np0005593294 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 04:45:02 np0005593294 iptables.init[62783]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 04:45:02 np0005593294 iptables.init[62783]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 04:45:02 np0005593294 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 04:45:02 np0005593294 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 04:45:02 np0005593294 python3.9[62980]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:45:03 np0005593294 python3.9[63134]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:45:04 np0005593294 systemd[1]: Reloading.
Jan 23 04:45:04 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:45:04 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:45:04 np0005593294 systemd[1]: Starting Netfilter Tables...
Jan 23 04:45:04 np0005593294 systemd[1]: Finished Netfilter Tables.
Jan 23 04:45:05 np0005593294 python3.9[63326]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:06 np0005593294 python3.9[63479]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:07 np0005593294 python3.9[63604]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161506.1696353-697-261973249454088/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:07 np0005593294 python3.9[63757]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:45:08 np0005593294 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 04:45:08 np0005593294 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 04:45:08 np0005593294 python3.9[63913]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:09 np0005593294 python3.9[64065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:10 np0005593294 python3.9[64188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161509.0550687-790-90975495699921/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:11 np0005593294 python3.9[64340]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:45:11 np0005593294 systemd[1]: Starting Time & Date Service...
Jan 23 04:45:11 np0005593294 systemd[1]: Started Time & Date Service.
Jan 23 04:45:11 np0005593294 python3.9[64496]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:13 np0005593294 python3.9[64648]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:14 np0005593294 python3.9[64771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161513.1783187-895-187378640113573/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:14 np0005593294 python3.9[64923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:15 np0005593294 python3.9[65046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161514.3688216-940-124432934527201/.source.yaml _original_basename=.g4g35s3q follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:16 np0005593294 python3.9[65198]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:16 np0005593294 python3.9[65321]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161515.5931053-985-206815604767094/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:17 np0005593294 python3.9[65473]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:18 np0005593294 python3.9[65626]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:19 np0005593294 python3[65779]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:45:19 np0005593294 python3.9[65931]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:20 np0005593294 python3.9[66054]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161519.4919567-1102-115947755542535/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:21 np0005593294 python3.9[66206]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:21 np0005593294 python3.9[66329]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161520.7631946-1147-30328738945675/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:22 np0005593294 python3.9[66481]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:23 np0005593294 python3.9[66604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161522.0188193-1192-129481673749233/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:23 np0005593294 python3.9[66756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:24 np0005593294 python3.9[66879]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161523.321505-1237-4778494083511/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:25 np0005593294 python3.9[67031]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:25 np0005593294 python3.9[67154]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161524.5952666-1282-75952755557257/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:26 np0005593294 python3.9[67306]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:27 np0005593294 python3.9[67458]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:28 np0005593294 python3.9[67617]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:28 np0005593294 python3.9[67770]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:29 np0005593294 python3.9[67922]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:30 np0005593294 python3.9[68074]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:45:31 np0005593294 python3.9[68227]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:45:31 np0005593294 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 04:45:31 np0005593294 systemd[1]: session-15.scope: Consumed 36.479s CPU time.
Jan 23 04:45:31 np0005593294 systemd-logind[807]: Session 15 logged out. Waiting for processes to exit.
Jan 23 04:45:31 np0005593294 systemd-logind[807]: Removed session 15.
Jan 23 04:45:37 np0005593294 systemd-logind[807]: New session 16 of user zuul.
Jan 23 04:45:37 np0005593294 systemd[1]: Started Session 16 of User zuul.
Jan 23 04:45:38 np0005593294 python3.9[68408]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 04:45:38 np0005593294 python3.9[68560]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:45:40 np0005593294 python3.9[68712]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:45:41 np0005593294 python3.9[68864]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=#012 create=True mode=0644 path=/tmp/ansible.ncrgoy7m state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:41 np0005593294 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:45:42 np0005593294 python3.9[69018]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ncrgoy7m' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:42 np0005593294 python3.9[69172]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ncrgoy7m state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:43 np0005593294 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 04:45:43 np0005593294 systemd[1]: session-16.scope: Consumed 3.392s CPU time.
Jan 23 04:45:43 np0005593294 systemd-logind[807]: Session 16 logged out. Waiting for processes to exit.
Jan 23 04:45:43 np0005593294 systemd-logind[807]: Removed session 16.
Jan 23 04:45:49 np0005593294 systemd-logind[807]: New session 17 of user zuul.
Jan 23 04:45:49 np0005593294 systemd[1]: Started Session 17 of User zuul.
Jan 23 04:45:50 np0005593294 python3.9[69350]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:45:52 np0005593294 python3.9[69506]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:45:52 np0005593294 python3.9[69660]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:45:53 np0005593294 python3.9[69813]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:54 np0005593294 python3.9[69966]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:45:55 np0005593294 python3.9[70120]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:56 np0005593294 python3.9[70275]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:57 np0005593294 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 04:45:57 np0005593294 systemd[1]: session-17.scope: Consumed 4.521s CPU time.
Jan 23 04:45:57 np0005593294 systemd-logind[807]: Session 17 logged out. Waiting for processes to exit.
Jan 23 04:45:57 np0005593294 systemd-logind[807]: Removed session 17.
Jan 23 04:46:02 np0005593294 systemd-logind[807]: New session 18 of user zuul.
Jan 23 04:46:02 np0005593294 systemd[1]: Started Session 18 of User zuul.
Jan 23 04:46:03 np0005593294 python3.9[70453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:46:04 np0005593294 python3.9[70609]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:46:05 np0005593294 python3.9[70693]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:46:08 np0005593294 python3.9[70844]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:46:10 np0005593294 python3.9[70995]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:46:11 np0005593294 python3.9[71145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:46:11 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:46:11 np0005593294 python3.9[71297]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:46:12 np0005593294 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 04:46:12 np0005593294 systemd[1]: session-18.scope: Consumed 6.367s CPU time.
Jan 23 04:46:12 np0005593294 systemd-logind[807]: Session 18 logged out. Waiting for processes to exit.
Jan 23 04:46:12 np0005593294 systemd-logind[807]: Removed session 18.
Jan 23 04:46:20 np0005593294 systemd-logind[807]: New session 19 of user zuul.
Jan 23 04:46:20 np0005593294 systemd[1]: Started Session 19 of User zuul.
Jan 23 04:46:26 np0005593294 python3[72064]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:46:28 np0005593294 python3[72159]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 04:46:30 np0005593294 python3[72186]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 04:46:30 np0005593294 python3[72212]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:46:30 np0005593294 kernel: loop: module loaded
Jan 23 04:46:30 np0005593294 kernel: loop3: detected capacity change from 0 to 41943040
Jan 23 04:46:31 np0005593294 python3[72248]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:46:31 np0005593294 lvm[72251]: PV /dev/loop3 not used.
Jan 23 04:46:31 np0005593294 lvm[72260]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:46:31 np0005593294 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 23 04:46:31 np0005593294 lvm[72262]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 23 04:46:31 np0005593294 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 23 04:46:32 np0005593294 python3[72340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:46:32 np0005593294 chronyd[58533]: Selected source 54.39.23.64 (pool.ntp.org)
Jan 23 04:46:32 np0005593294 python3[72413]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769161591.8796751-37004-63740275737266/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:46:33 np0005593294 python3[72463]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:46:33 np0005593294 systemd[1]: Reloading.
Jan 23 04:46:33 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:46:33 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:46:33 np0005593294 systemd[1]: Starting Ceph OSD losetup...
Jan 23 04:46:33 np0005593294 bash[72504]: /dev/loop3: [64513]:4328449 (/var/lib/ceph-osd-0.img)
Jan 23 04:46:33 np0005593294 systemd[1]: Finished Ceph OSD losetup.
Jan 23 04:46:33 np0005593294 lvm[72505]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:46:33 np0005593294 lvm[72505]: VG ceph_vg0 finished
Jan 23 04:46:36 np0005593294 python3[72529]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:48:43 np0005593294 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 04:48:43 np0005593294 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 04:48:43 np0005593294 systemd-logind[807]: New session 20 of user ceph-admin.
Jan 23 04:48:43 np0005593294 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 04:48:43 np0005593294 systemd[1]: Starting User Manager for UID 42477...
Jan 23 04:48:43 np0005593294 systemd[72579]: Queued start job for default target Main User Target.
Jan 23 04:48:43 np0005593294 systemd[72579]: Created slice User Application Slice.
Jan 23 04:48:43 np0005593294 systemd[72579]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:48:43 np0005593294 systemd[72579]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:48:43 np0005593294 systemd[72579]: Reached target Paths.
Jan 23 04:48:43 np0005593294 systemd[72579]: Reached target Timers.
Jan 23 04:48:43 np0005593294 systemd[72579]: Starting D-Bus User Message Bus Socket...
Jan 23 04:48:43 np0005593294 systemd[72579]: Starting Create User's Volatile Files and Directories...
Jan 23 04:48:43 np0005593294 systemd[72579]: Finished Create User's Volatile Files and Directories.
Jan 23 04:48:43 np0005593294 systemd[72579]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:48:43 np0005593294 systemd[72579]: Reached target Sockets.
Jan 23 04:48:43 np0005593294 systemd[72579]: Reached target Basic System.
Jan 23 04:48:43 np0005593294 systemd[1]: Started User Manager for UID 42477.
Jan 23 04:48:43 np0005593294 systemd[72579]: Reached target Main User Target.
Jan 23 04:48:43 np0005593294 systemd[72579]: Startup finished in 121ms.
Jan 23 04:48:44 np0005593294 systemd[1]: Started Session 20 of User ceph-admin.
Jan 23 04:48:44 np0005593294 systemd-logind[807]: New session 22 of user ceph-admin.
Jan 23 04:48:44 np0005593294 systemd[1]: Started Session 22 of User ceph-admin.
Jan 23 04:48:44 np0005593294 systemd-logind[807]: New session 23 of user ceph-admin.
Jan 23 04:48:44 np0005593294 systemd[1]: Started Session 23 of User ceph-admin.
Jan 23 04:48:44 np0005593294 systemd-logind[807]: New session 24 of user ceph-admin.
Jan 23 04:48:44 np0005593294 systemd[1]: Started Session 24 of User ceph-admin.
Jan 23 04:48:45 np0005593294 systemd-logind[807]: New session 25 of user ceph-admin.
Jan 23 04:48:45 np0005593294 systemd[1]: Started Session 25 of User ceph-admin.
Jan 23 04:48:45 np0005593294 systemd-logind[807]: New session 26 of user ceph-admin.
Jan 23 04:48:45 np0005593294 systemd[1]: Started Session 26 of User ceph-admin.
Jan 23 04:48:45 np0005593294 systemd-logind[807]: New session 27 of user ceph-admin.
Jan 23 04:48:45 np0005593294 systemd[1]: Started Session 27 of User ceph-admin.
Jan 23 04:48:46 np0005593294 systemd-logind[807]: New session 28 of user ceph-admin.
Jan 23 04:48:46 np0005593294 systemd[1]: Started Session 28 of User ceph-admin.
Jan 23 04:48:46 np0005593294 systemd-logind[807]: New session 29 of user ceph-admin.
Jan 23 04:48:46 np0005593294 systemd[1]: Started Session 29 of User ceph-admin.
Jan 23 04:48:46 np0005593294 systemd-logind[807]: New session 30 of user ceph-admin.
Jan 23 04:48:46 np0005593294 systemd[1]: Started Session 30 of User ceph-admin.
Jan 23 04:48:48 np0005593294 systemd-logind[807]: New session 31 of user ceph-admin.
Jan 23 04:48:48 np0005593294 systemd[1]: Started Session 31 of User ceph-admin.
Jan 23 04:48:48 np0005593294 systemd-logind[807]: New session 32 of user ceph-admin.
Jan 23 04:48:48 np0005593294 systemd[1]: Started Session 32 of User ceph-admin.
Jan 23 04:48:48 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:48:49 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:48:49 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:48:50 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:48:50 np0005593294 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73148 (sysctl)
Jan 23 04:48:50 np0005593294 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 04:48:50 np0005593294 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 04:48:51 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:48:51 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:48:54 np0005593294 systemd[1]: var-lib-containers-storage-overlay-compat3883837964-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 04:49:17 np0005593294 podman[73325]: 2026-01-23 09:49:17.915282154 +0000 UTC m=+26.365516815 container create 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 23 04:49:17 np0005593294 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck240398213-merged.mount: Deactivated successfully.
Jan 23 04:49:17 np0005593294 podman[73325]: 2026-01-23 09:49:17.897979938 +0000 UTC m=+26.348214619 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:17 np0005593294 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 04:49:17 np0005593294 systemd[1]: Started libpod-conmon-49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f.scope.
Jan 23 04:49:17 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:18 np0005593294 podman[73325]: 2026-01-23 09:49:18.01155549 +0000 UTC m=+26.461790251 container init 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 23 04:49:18 np0005593294 podman[73325]: 2026-01-23 09:49:18.023111585 +0000 UTC m=+26.473346246 container start 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:18 np0005593294 podman[73325]: 2026-01-23 09:49:18.027193423 +0000 UTC m=+26.477428144 container attach 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:18 np0005593294 tender_kapitsa[73407]: 167 167
Jan 23 04:49:18 np0005593294 systemd[1]: libpod-49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f.scope: Deactivated successfully.
Jan 23 04:49:18 np0005593294 podman[73325]: 2026-01-23 09:49:18.033292965 +0000 UTC m=+26.483527666 container died 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:49:18 np0005593294 systemd[1]: var-lib-containers-storage-overlay-05797ca99d44ba6745e4b9cc15b5014df2a4a0bccd229ef6043ead52cc06c67d-merged.mount: Deactivated successfully.
Jan 23 04:49:18 np0005593294 podman[73325]: 2026-01-23 09:49:18.084894343 +0000 UTC m=+26.535129034 container remove 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:49:18 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:18 np0005593294 systemd[1]: libpod-conmon-49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f.scope: Deactivated successfully.
Jan 23 04:49:18 np0005593294 podman[73430]: 2026-01-23 09:49:18.300359088 +0000 UTC m=+0.040482518 container create f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:49:18 np0005593294 systemd[1]: Started libpod-conmon-f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4.scope.
Jan 23 04:49:18 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:18 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a9efd00be1246942a1fb7f43342804d54e64fa2dfe4cc31eee8922f095b8156/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:18 np0005593294 podman[73430]: 2026-01-23 09:49:18.282101172 +0000 UTC m=+0.022224542 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:18 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a9efd00be1246942a1fb7f43342804d54e64fa2dfe4cc31eee8922f095b8156/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:18 np0005593294 podman[73430]: 2026-01-23 09:49:18.387758344 +0000 UTC m=+0.127881704 container init f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:18 np0005593294 podman[73430]: 2026-01-23 09:49:18.395532249 +0000 UTC m=+0.135655589 container start f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:49:18 np0005593294 podman[73430]: 2026-01-23 09:49:18.39907048 +0000 UTC m=+0.139193850 container attach f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]: [
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:    {
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "available": false,
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "being_replaced": false,
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "ceph_device_lvm": false,
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "lsm_data": {},
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "lvs": [],
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "path": "/dev/sr0",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "rejected_reasons": [
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "Insufficient space (<5GB)",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "Has a FileSystem"
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        ],
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        "sys_api": {
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "actuators": null,
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "device_nodes": [
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:                "sr0"
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            ],
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "devname": "sr0",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "human_readable_size": "482.00 KB",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "id_bus": "ata",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "model": "QEMU DVD-ROM",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "nr_requests": "2",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "parent": "/dev/sr0",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "partitions": {},
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "path": "/dev/sr0",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "removable": "1",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "rev": "2.5+",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "ro": "0",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "rotational": "1",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "sas_address": "",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "sas_device_handle": "",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "scheduler_mode": "mq-deadline",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "sectors": 0,
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "sectorsize": "2048",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "size": 493568.0,
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "support_discard": "2048",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "type": "disk",
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:            "vendor": "QEMU"
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:        }
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]:    }
Jan 23 04:49:19 np0005593294 fervent_yalow[73446]: ]
Jan 23 04:49:19 np0005593294 systemd[1]: libpod-f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4.scope: Deactivated successfully.
Jan 23 04:49:19 np0005593294 podman[73430]: 2026-01-23 09:49:19.176339713 +0000 UTC m=+0.916463083 container died f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:19 np0005593294 systemd[1]: var-lib-containers-storage-overlay-2a9efd00be1246942a1fb7f43342804d54e64fa2dfe4cc31eee8922f095b8156-merged.mount: Deactivated successfully.
Jan 23 04:49:19 np0005593294 podman[73430]: 2026-01-23 09:49:19.232545025 +0000 UTC m=+0.972668365 container remove f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 23 04:49:19 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:19 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:19 np0005593294 systemd[1]: libpod-conmon-f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4.scope: Deactivated successfully.
Jan 23 04:49:22 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:22 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.260559577 +0000 UTC m=+0.043698469 container create bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:49:22 np0005593294 systemd[1]: Started libpod-conmon-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope.
Jan 23 04:49:22 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.243369494 +0000 UTC m=+0.026508406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.340927481 +0000 UTC m=+0.124066403 container init bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.348935054 +0000 UTC m=+0.132073946 container start bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.353175567 +0000 UTC m=+0.136314459 container attach bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 23 04:49:22 np0005593294 zealous_sutherland[75450]: 167 167
Jan 23 04:49:22 np0005593294 systemd[1]: libpod-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope: Deactivated successfully.
Jan 23 04:49:22 np0005593294 conmon[75450]: conmon bf358994fa0eaa3ceb96 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope/container/memory.events
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.357002959 +0000 UTC m=+0.140141891 container died bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:49:22 np0005593294 podman[75433]: 2026-01-23 09:49:22.449599799 +0000 UTC m=+0.232738731 container remove bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 23 04:49:22 np0005593294 systemd[1]: libpod-conmon-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope: Deactivated successfully.
Jan 23 04:49:22 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:22 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:22 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:22 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:22 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:22 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:22 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:23 np0005593294 systemd[1]: Reached target All Ceph clusters and services.
Jan 23 04:49:23 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:23 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:23 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:23 np0005593294 systemd[1]: Reached target Ceph cluster f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:49:23 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:23 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:23 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:23 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:23 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:23 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:23 np0005593294 systemd[1]: Created slice Slice /system/ceph-f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:49:23 np0005593294 systemd[1]: Reached target System Time Set.
Jan 23 04:49:23 np0005593294 systemd[1]: Reached target System Time Synchronized.
Jan 23 04:49:23 np0005593294 systemd[1]: Starting Ceph crash.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:49:23 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:24 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:24 np0005593294 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:24 np0005593294 podman[75702]: 2026-01-23 09:49:24.200286878 +0000 UTC m=+0.058757084 container create 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 23 04:49:24 np0005593294 podman[75702]: 2026-01-23 09:49:24.16767125 +0000 UTC m=+0.026141506 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e5a23edbb65ae3813549cdaf0db86959f39a099db0f015b3bec080e11441f80/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e5a23edbb65ae3813549cdaf0db86959f39a099db0f015b3bec080e11441f80/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e5a23edbb65ae3813549cdaf0db86959f39a099db0f015b3bec080e11441f80/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:24 np0005593294 podman[75702]: 2026-01-23 09:49:24.295047906 +0000 UTC m=+0.153518162 container init 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:49:24 np0005593294 podman[75702]: 2026-01-23 09:49:24.300648193 +0000 UTC m=+0.159118409 container start 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:49:24 np0005593294 bash[75702]: 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f
Jan 23 04:49:24 np0005593294 systemd[1]: Started Ceph crash.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.445+0000 7f52cace8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.445+0000 7f52cace8640 -1 AuthRegistry(0x7f52c40698f0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.446+0000 7f52cace8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.446+0000 7f52cace8640 -1 AuthRegistry(0x7f52cace6ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.502+0000 7f52c8a5d640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.502+0000 7f52cace8640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 23 04:49:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 23 04:49:25 np0005593294 podman[75824]: 2026-01-23 09:49:25.848166406 +0000 UTC m=+0.076593696 container create f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:49:25 np0005593294 podman[75824]: 2026-01-23 09:49:25.794782493 +0000 UTC m=+0.023209803 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:25 np0005593294 systemd[1]: Started libpod-conmon-f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b.scope.
Jan 23 04:49:25 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:25 np0005593294 podman[75824]: 2026-01-23 09:49:25.949931615 +0000 UTC m=+0.178358935 container init f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325)
Jan 23 04:49:25 np0005593294 podman[75824]: 2026-01-23 09:49:25.960110137 +0000 UTC m=+0.188537447 container start f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 04:49:25 np0005593294 musing_roentgen[75841]: 167 167
Jan 23 04:49:25 np0005593294 systemd[1]: libpod-f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b.scope: Deactivated successfully.
Jan 23 04:49:25 np0005593294 podman[75824]: 2026-01-23 09:49:25.979966923 +0000 UTC m=+0.208394233 container attach f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 04:49:25 np0005593294 podman[75824]: 2026-01-23 09:49:25.980914192 +0000 UTC m=+0.209341512 container died f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:49:26 np0005593294 systemd[1]: var-lib-containers-storage-overlay-a55b7ab61ea8d62c1744f01b99bdf1124094057c0b58eb7b39dfa65ae663ea2f-merged.mount: Deactivated successfully.
Jan 23 04:49:26 np0005593294 podman[75824]: 2026-01-23 09:49:26.05121978 +0000 UTC m=+0.279647060 container remove f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:26 np0005593294 systemd[1]: libpod-conmon-f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b.scope: Deactivated successfully.
Jan 23 04:49:26 np0005593294 podman[75867]: 2026-01-23 09:49:26.273132138 +0000 UTC m=+0.104731124 container create d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 04:49:26 np0005593294 podman[75867]: 2026-01-23 09:49:26.193973391 +0000 UTC m=+0.025572397 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:26 np0005593294 systemd[1]: Started libpod-conmon-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope.
Jan 23 04:49:26 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:26 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:26 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:26 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:26 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:26 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:26 np0005593294 podman[75867]: 2026-01-23 09:49:26.409845279 +0000 UTC m=+0.241444295 container init d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:26 np0005593294 podman[75867]: 2026-01-23 09:49:26.420848966 +0000 UTC m=+0.252447952 container start d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:49:26 np0005593294 podman[75867]: 2026-01-23 09:49:26.436594763 +0000 UTC m=+0.268193779 container attach d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default)
Jan 23 04:49:26 np0005593294 trusting_wu[75883]: --> passed data devices: 0 physical, 1 LVM
Jan 23 04:49:26 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:26 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:26 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 92663454-00ec-4b9a-bcda-939cb5c501aa
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Jan 23 04:49:27 np0005593294 lvm[75947]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:49:27 np0005593294 lvm[75947]: VG ceph_vg0 finished
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: stderr: got monmap epoch 1
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: --> Creating keyring file for osd.0
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Jan 23 04:49:27 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 92663454-00ec-4b9a-bcda-939cb5c501aa --setuser ceph --setgroup ceph
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: stderr: 2026-01-23T09:49:28.055+0000 7fe4104af740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: stderr: 2026-01-23T09:49:28.322+0000 7fe4104af740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: --> ceph-volume lvm activate successful for osd ID: 0
Jan 23 04:49:36 np0005593294 trusting_wu[75883]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 23 04:49:36 np0005593294 systemd[1]: libpod-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope: Deactivated successfully.
Jan 23 04:49:36 np0005593294 systemd[1]: libpod-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope: Consumed 2.217s CPU time.
Jan 23 04:49:36 np0005593294 podman[75867]: 2026-01-23 09:49:36.641107024 +0000 UTC m=+10.472706040 container died d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:36 np0005593294 systemd[1]: var-lib-containers-storage-overlay-cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18-merged.mount: Deactivated successfully.
Jan 23 04:49:37 np0005593294 podman[75867]: 2026-01-23 09:49:37.205300897 +0000 UTC m=+11.036899913 container remove d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:49:37 np0005593294 systemd[1]: libpod-conmon-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope: Deactivated successfully.
Jan 23 04:49:37 np0005593294 podman[76951]: 2026-01-23 09:49:37.820398455 +0000 UTC m=+0.027695315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:37 np0005593294 podman[76951]: 2026-01-23 09:49:37.952017996 +0000 UTC m=+0.159314866 container create 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:38 np0005593294 systemd[1]: Started libpod-conmon-6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517.scope.
Jan 23 04:49:38 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:38 np0005593294 podman[76951]: 2026-01-23 09:49:38.093916071 +0000 UTC m=+0.301212991 container init 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default)
Jan 23 04:49:38 np0005593294 podman[76951]: 2026-01-23 09:49:38.102677596 +0000 UTC m=+0.309974446 container start 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325)
Jan 23 04:49:38 np0005593294 unruffled_faraday[76967]: 167 167
Jan 23 04:49:38 np0005593294 systemd[1]: libpod-6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517.scope: Deactivated successfully.
Jan 23 04:49:38 np0005593294 podman[76951]: 2026-01-23 09:49:38.116701409 +0000 UTC m=+0.323998269 container attach 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 04:49:38 np0005593294 podman[76951]: 2026-01-23 09:49:38.117146153 +0000 UTC m=+0.324443023 container died 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 04:49:38 np0005593294 systemd[1]: var-lib-containers-storage-overlay-6d592c6c6aee16d2ca07c8583a263c323d6f0eff28b30da164c6d4cfb1a74e09-merged.mount: Deactivated successfully.
Jan 23 04:49:38 np0005593294 podman[76951]: 2026-01-23 09:49:38.312566505 +0000 UTC m=+0.519863345 container remove 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:49:38 np0005593294 systemd[1]: libpod-conmon-6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517.scope: Deactivated successfully.
Jan 23 04:49:38 np0005593294 podman[76994]: 2026-01-23 09:49:38.502870607 +0000 UTC m=+0.036292635 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:38 np0005593294 podman[76994]: 2026-01-23 09:49:38.600412114 +0000 UTC m=+0.133834132 container create cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:38 np0005593294 systemd[1]: Started libpod-conmon-cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6.scope.
Jan 23 04:49:38 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:38 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:38 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:38 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:38 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:38 np0005593294 podman[76994]: 2026-01-23 09:49:38.738897121 +0000 UTC m=+0.272319149 container init cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 04:49:38 np0005593294 podman[76994]: 2026-01-23 09:49:38.749881087 +0000 UTC m=+0.283303105 container start cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:49:38 np0005593294 podman[76994]: 2026-01-23 09:49:38.873586249 +0000 UTC m=+0.407008267 container attach cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 04:49:39 np0005593294 blissful_elion[77011]: {
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:    "0": [
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:        {
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "devices": [
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "/dev/loop3"
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            ],
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "lv_name": "ceph_lv0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "lv_size": "21470642176",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PKvnhX-HhRc-31GG-pCfe-6hIl-d5aM-9QV6d5,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f3005f84-239a-55b6-a948-8f1fb592b920,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=92663454-00ec-4b9a-bcda-939cb5c501aa,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "lv_uuid": "PKvnhX-HhRc-31GG-pCfe-6hIl-d5aM-9QV6d5",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "name": "ceph_lv0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "tags": {
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.block_uuid": "PKvnhX-HhRc-31GG-pCfe-6hIl-d5aM-9QV6d5",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.cephx_lockbox_secret": "",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.cluster_fsid": "f3005f84-239a-55b6-a948-8f1fb592b920",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.cluster_name": "ceph",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.crush_device_class": "",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.encrypted": "0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.osd_fsid": "92663454-00ec-4b9a-bcda-939cb5c501aa",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.osd_id": "0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.type": "block",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.vdo": "0",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:                "ceph.with_tpm": "0"
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            },
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "type": "block",
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:            "vg_name": "ceph_vg0"
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:        }
Jan 23 04:49:39 np0005593294 blissful_elion[77011]:    ]
Jan 23 04:49:39 np0005593294 blissful_elion[77011]: }
Jan 23 04:49:39 np0005593294 systemd[1]: libpod-cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6.scope: Deactivated successfully.
Jan 23 04:49:39 np0005593294 podman[77020]: 2026-01-23 09:49:39.090871301 +0000 UTC m=+0.025945809 container died cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 23 04:49:39 np0005593294 systemd[1]: var-lib-containers-storage-overlay-d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93-merged.mount: Deactivated successfully.
Jan 23 04:49:39 np0005593294 podman[77020]: 2026-01-23 09:49:39.373684889 +0000 UTC m=+0.308759387 container remove cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:39 np0005593294 systemd[1]: libpod-conmon-cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6.scope: Deactivated successfully.
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:39.988484419 +0000 UTC m=+0.025025531 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:40.165292914 +0000 UTC m=+0.201834026 container create f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:49:40 np0005593294 systemd[1]: Started libpod-conmon-f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3.scope.
Jan 23 04:49:40 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:40.339452706 +0000 UTC m=+0.375993858 container init f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:40.345859688 +0000 UTC m=+0.382400790 container start f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default)
Jan 23 04:49:40 np0005593294 elated_golick[77143]: 167 167
Jan 23 04:49:40 np0005593294 systemd[1]: libpod-f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3.scope: Deactivated successfully.
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:40.362154172 +0000 UTC m=+0.398695354 container attach f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:40.36272551 +0000 UTC m=+0.399266652 container died f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 23 04:49:40 np0005593294 systemd[1]: var-lib-containers-storage-overlay-ac93b8b960b88f01538dfc377c79ac43fbe7d2360e229169fbea32c69e52f84f-merged.mount: Deactivated successfully.
Jan 23 04:49:40 np0005593294 podman[77126]: 2026-01-23 09:49:40.454263527 +0000 UTC m=+0.490804669 container remove f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 04:49:40 np0005593294 systemd[1]: libpod-conmon-f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3.scope: Deactivated successfully.
Jan 23 04:49:40 np0005593294 podman[77174]: 2026-01-23 09:49:40.758664807 +0000 UTC m=+0.047016175 container create 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:40 np0005593294 podman[77174]: 2026-01-23 09:49:40.734104651 +0000 UTC m=+0.022455999 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:40 np0005593294 systemd[1]: Started libpod-conmon-9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8.scope.
Jan 23 04:49:40 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:40 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:40 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:40 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:40 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:40 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:40 np0005593294 podman[77174]: 2026-01-23 09:49:40.900921123 +0000 UTC m=+0.189272481 container init 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 04:49:40 np0005593294 podman[77174]: 2026-01-23 09:49:40.913876071 +0000 UTC m=+0.202227399 container start 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:49:41 np0005593294 podman[77174]: 2026-01-23 09:49:41.017720546 +0000 UTC m=+0.306071904 container attach 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:49:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test[77190]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 23 04:49:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test[77190]:                            [--no-systemd] [--no-tmpfs]
Jan 23 04:49:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test[77190]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 23 04:49:41 np0005593294 systemd[1]: libpod-9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8.scope: Deactivated successfully.
Jan 23 04:49:41 np0005593294 podman[77174]: 2026-01-23 09:49:41.117646767 +0000 UTC m=+0.405998125 container died 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:41 np0005593294 systemd[1]: var-lib-containers-storage-overlay-ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579-merged.mount: Deactivated successfully.
Jan 23 04:49:41 np0005593294 podman[77174]: 2026-01-23 09:49:41.513351696 +0000 UTC m=+0.801703034 container remove 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:49:41 np0005593294 systemd[1]: libpod-conmon-9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8.scope: Deactivated successfully.
Jan 23 04:49:42 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:42 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:42 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:42 np0005593294 systemd[1]: Reloading.
Jan 23 04:49:42 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:49:42 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:49:42 np0005593294 systemd[1]: Starting Ceph osd.0 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:49:42 np0005593294 podman[77350]: 2026-01-23 09:49:42.751904036 +0000 UTC m=+0.050207855 container create 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 04:49:42 np0005593294 podman[77350]: 2026-01-23 09:49:42.728123915 +0000 UTC m=+0.026427894 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:42 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:42 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:42 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:42 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:42 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:42 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:42 np0005593294 podman[77350]: 2026-01-23 09:49:42.976001672 +0000 UTC m=+0.274305581 container init 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:49:42 np0005593294 podman[77350]: 2026-01-23 09:49:42.987240867 +0000 UTC m=+0.285544696 container start 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 23 04:49:43 np0005593294 podman[77350]: 2026-01-23 09:49:43.052226196 +0000 UTC m=+0.350530015 container attach 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1)
Jan 23 04:49:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:43 np0005593294 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:43 np0005593294 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:43 np0005593294 lvm[77446]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:49:43 np0005593294 lvm[77446]: VG ceph_vg0 finished
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:44 np0005593294 bash[77350]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 04:49:44 np0005593294 bash[77350]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 04:49:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: --> ceph-volume lvm activate successful for osd ID: 0
Jan 23 04:49:44 np0005593294 bash[77350]: --> ceph-volume lvm activate successful for osd ID: 0
Jan 23 04:49:44 np0005593294 systemd[1]: libpod-64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f.scope: Deactivated successfully.
Jan 23 04:49:44 np0005593294 systemd[1]: libpod-64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f.scope: Consumed 1.750s CPU time.
Jan 23 04:49:44 np0005593294 podman[77540]: 2026-01-23 09:49:44.591243962 +0000 UTC m=+0.032728834 container died 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:44 np0005593294 systemd[1]: var-lib-containers-storage-overlay-17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5-merged.mount: Deactivated successfully.
Jan 23 04:49:44 np0005593294 podman[77540]: 2026-01-23 09:49:44.792376645 +0000 UTC m=+0.233861437 container remove 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 23 04:49:45 np0005593294 podman[77599]: 2026-01-23 09:49:45.123119985 +0000 UTC m=+0.118171028 container create 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:45 np0005593294 podman[77599]: 2026-01-23 09:49:45.038714543 +0000 UTC m=+0.033765586 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:45 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:45 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:45 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:45 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:45 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:45 np0005593294 podman[77599]: 2026-01-23 09:49:45.205747841 +0000 UTC m=+0.200798944 container init 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:45 np0005593294 podman[77599]: 2026-01-23 09:49:45.210691976 +0000 UTC m=+0.205743019 container start 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 23 04:49:45 np0005593294 bash[77599]: 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b
Jan 23 04:49:45 np0005593294 systemd[1]: Started Ceph osd.0 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: pidfile_write: ignore empty --pid-file
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Jan 23 04:49:45 np0005593294 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: load: jerasure load: lrc 
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:46 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount shared_bdev_used = 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Git sha 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: DB SUMMARY
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: DB Session ID:  KYUCMB80H616SE245L90
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                     Options.env: 0x55a55f711dc0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                Options.info_log: 0x55a55f7157a0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.write_buffer_manager: 0x55a55f80aa00
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.row_cache: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                              Options.wal_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.wal_compression: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Compression algorithms supported:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kZSTD supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e9309b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e9309b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e9309b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: edb5d1d6-d8de-4399-98b8-c0de0b841c0c
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161787309052, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161787309291, "job": 1, "event": "recovery_finished"}
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: freelist init
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: freelist _read_cfg
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs umount
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluefs mount shared_bdev_used = 4718592
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Git sha 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: DB SUMMARY
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: DB Session ID:  KYUCMB80H616SE245L91
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                     Options.env: 0x55a55f8ae9a0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                Options.info_log: 0x55a55f715940
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.write_buffer_manager: 0x55a55f80aa00
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.row_cache: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                              Options.wal_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.wal_compression: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Compression algorithms supported:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kZSTD supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e931350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e931350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715ac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55a55e9309b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e9309b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55a55e9309b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: edb5d1d6-d8de-4399-98b8-c0de0b841c0c
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161787583393, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:49:47 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:49:48 np0005593294 podman[78123]: 2026-01-23 09:49:47.917729703 +0000 UTC m=+0.024168300 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791589979, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161787, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "edb5d1d6-d8de-4399-98b8-c0de0b841c0c", "db_session_id": "KYUCMB80H616SE245L91", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:51 np0005593294 podman[78123]: 2026-01-23 09:49:51.626736577 +0000 UTC m=+3.733175154 container create 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True)
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791636078, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161791, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "edb5d1d6-d8de-4399-98b8-c0de0b841c0c", "db_session_id": "KYUCMB80H616SE245L91", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791639730, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161791, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "edb5d1d6-d8de-4399-98b8-c0de0b841c0c", "db_session_id": "KYUCMB80H616SE245L91", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791642107, "job": 1, "event": "recovery_finished"}
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 23 04:49:51 np0005593294 systemd[1]: Started libpod-conmon-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope.
Jan 23 04:49:51 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:51 np0005593294 podman[78123]: 2026-01-23 09:49:51.739075591 +0000 UTC m=+3.845514178 container init 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 23 04:49:51 np0005593294 podman[78123]: 2026-01-23 09:49:51.746608152 +0000 UTC m=+3.853046709 container start 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:49:51 np0005593294 pensive_albattani[78141]: 167 167
Jan 23 04:49:51 np0005593294 systemd[1]: libpod-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope: Deactivated successfully.
Jan 23 04:49:51 np0005593294 conmon[78141]: conmon 8e52773d66f0d45762a8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope/container/memory.events
Jan 23 04:49:51 np0005593294 podman[78123]: 2026-01-23 09:49:51.77340427 +0000 UTC m=+3.879842937 container attach 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:51 np0005593294 podman[78123]: 2026-01-23 09:49:51.774423436 +0000 UTC m=+3.880862003 container died 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a55f912000
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: DB pointer 0x55a55f8bc000
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4.3 total, 4.3 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4.3 total, 4.3 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4.3 total, 4.3 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4.3 total, 4.3 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usag
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: _get_class not permitted to load lua
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: _get_class not permitted to load sdk
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 load_pgs
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 load_pgs opened 0 pgs
Jan 23 04:49:51 np0005593294 ceph-osd[77616]: osd.0 0 log_to_monitors true
Jan 23 04:49:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0[77612]: 2026-01-23T09:49:51.826+0000 7f80f7b74740 -1 osd.0 0 log_to_monitors true
Jan 23 04:49:51 np0005593294 systemd[1]: var-lib-containers-storage-overlay-655734e702f595c2054344cff8329887765228ffdd492f8b3ccc4b124e06f8eb-merged.mount: Deactivated successfully.
Jan 23 04:49:51 np0005593294 podman[78123]: 2026-01-23 09:49:51.903477169 +0000 UTC m=+4.009915776 container remove 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:51 np0005593294 systemd[1]: libpod-conmon-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope: Deactivated successfully.
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.080401251 +0000 UTC m=+0.071998027 container create 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:52 np0005593294 systemd[1]: Started libpod-conmon-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope.
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.0316169 +0000 UTC m=+0.023213656 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:52 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:52 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:52 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:52 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:52 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.166286967 +0000 UTC m=+0.157883793 container init 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.178898134 +0000 UTC m=+0.170494910 container start 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.2399259 +0000 UTC m=+0.231522636 container attach 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:52 np0005593294 lvm[78288]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:49:52 np0005593294 lvm[78288]: VG ceph_vg0 finished
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 23 04:49:52 np0005593294 loving_knuth[78214]: {}
Jan 23 04:49:52 np0005593294 systemd[1]: libpod-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope: Deactivated successfully.
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.88597129 +0000 UTC m=+0.877568026 container died 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:52 np0005593294 systemd[1]: libpod-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope: Consumed 1.150s CPU time.
Jan 23 04:49:52 np0005593294 systemd[1]: var-lib-containers-storage-overlay-bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791-merged.mount: Deactivated successfully.
Jan 23 04:49:52 np0005593294 podman[78198]: 2026-01-23 09:49:52.932085047 +0000 UTC m=+0.923681773 container remove 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0 done with init, starting boot process
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0 start_boot
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 23 04:49:52 np0005593294 ceph-osd[77616]: osd.0 0  bench count 12288000 bsize 4 KiB
Jan 23 04:49:52 np0005593294 systemd[1]: libpod-conmon-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope: Deactivated successfully.
Jan 23 04:49:54 np0005593294 podman[78450]: 2026-01-23 09:49:54.262455525 +0000 UTC m=+0.408605672 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:49:54 np0005593294 podman[78470]: 2026-01-23 09:49:54.490716957 +0000 UTC m=+0.110803692 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 23 04:49:54 np0005593294 podman[78450]: 2026-01-23 09:49:54.62933951 +0000 UTC m=+0.775489697 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 23 04:49:56 np0005593294 podman[78671]: 2026-01-23 09:49:56.386100666 +0000 UTC m=+0.117839785 container create 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 04:49:56 np0005593294 podman[78671]: 2026-01-23 09:49:56.296702838 +0000 UTC m=+0.028441987 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:56 np0005593294 systemd[1]: Started libpod-conmon-3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a.scope.
Jan 23 04:49:56 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:56 np0005593294 podman[78671]: 2026-01-23 09:49:56.898996691 +0000 UTC m=+0.630735810 container init 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 04:49:56 np0005593294 podman[78671]: 2026-01-23 09:49:56.905699503 +0000 UTC m=+0.637438642 container start 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 04:49:56 np0005593294 modest_murdock[78687]: 167 167
Jan 23 04:49:56 np0005593294 systemd[1]: libpod-3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a.scope: Deactivated successfully.
Jan 23 04:49:57 np0005593294 podman[78671]: 2026-01-23 09:49:57.046175891 +0000 UTC m=+0.777915020 container attach 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 04:49:57 np0005593294 podman[78671]: 2026-01-23 09:49:57.047186466 +0000 UTC m=+0.778925575 container died 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:57 np0005593294 systemd[1]: var-lib-containers-storage-overlay-f9aca6e6255188e68cd156b8f3472d2ad8076950b3211f23582ffdbcd86fd0c1-merged.mount: Deactivated successfully.
Jan 23 04:49:57 np0005593294 podman[78671]: 2026-01-23 09:49:57.831485018 +0000 UTC m=+1.563224137 container remove 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:49:57 np0005593294 systemd[1]: libpod-conmon-3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a.scope: Deactivated successfully.
Jan 23 04:49:58 np0005593294 podman[78712]: 2026-01-23 09:49:58.03610559 +0000 UTC m=+0.045973403 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:49:58 np0005593294 podman[78712]: 2026-01-23 09:49:58.214743492 +0000 UTC m=+0.224611265 container create d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:49:58 np0005593294 systemd[1]: Started libpod-conmon-d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310.scope.
Jan 23 04:49:58 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:49:58 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:58 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:58 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:58 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:58 np0005593294 podman[78712]: 2026-01-23 09:49:58.704849877 +0000 UTC m=+0.714717670 container init d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:49:58 np0005593294 podman[78712]: 2026-01-23 09:49:58.712874056 +0000 UTC m=+0.722741859 container start d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 04:49:58 np0005593294 podman[78712]: 2026-01-23 09:49:58.931813804 +0000 UTC m=+0.941681657 container attach d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:49:59 np0005593294 condescending_jang[78728]: [
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:    {
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "available": false,
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "being_replaced": false,
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "ceph_device_lvm": false,
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "lsm_data": {},
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "lvs": [],
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "path": "/dev/sr0",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "rejected_reasons": [
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "Has a FileSystem",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "Insufficient space (<5GB)"
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        ],
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        "sys_api": {
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "actuators": null,
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "device_nodes": [
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:                "sr0"
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            ],
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "devname": "sr0",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "human_readable_size": "482.00 KB",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "id_bus": "ata",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "model": "QEMU DVD-ROM",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "nr_requests": "2",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "parent": "/dev/sr0",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "partitions": {},
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "path": "/dev/sr0",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "removable": "1",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "rev": "2.5+",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "ro": "0",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "rotational": "1",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "sas_address": "",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "sas_device_handle": "",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "scheduler_mode": "mq-deadline",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "sectors": 0,
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "sectorsize": "2048",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "size": 493568.0,
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "support_discard": "2048",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "type": "disk",
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:            "vendor": "QEMU"
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:        }
Jan 23 04:49:59 np0005593294 condescending_jang[78728]:    }
Jan 23 04:49:59 np0005593294 condescending_jang[78728]: ]
Jan 23 04:49:59 np0005593294 systemd[1]: libpod-d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310.scope: Deactivated successfully.
Jan 23 04:49:59 np0005593294 podman[78712]: 2026-01-23 09:49:59.439328583 +0000 UTC m=+1.449196376 container died d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:49:59 np0005593294 systemd[1]: var-lib-containers-storage-overlay-8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb-merged.mount: Deactivated successfully.
Jan 23 04:49:59 np0005593294 podman[78712]: 2026-01-23 09:49:59.909778887 +0000 UTC m=+1.919646660 container remove d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:50:00 np0005593294 systemd[1]: libpod-conmon-d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310.scope: Deactivated successfully.
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 13.396 iops: 3429.483 elapsed_sec: 0.875
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: log_channel(cluster) log [WRN] : OSD bench result of 3429.482546 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 0 waiting for initial osdmap
Jan 23 04:50:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0[77612]: 2026-01-23T09:50:03.888+0000 7f80f430a640 -1 osd.0 0 waiting for initial osdmap
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 check_osdmap_features require_osd_release unknown -> squid
Jan 23 04:50:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0[77612]: 2026-01-23T09:50:03.929+0000 7f80ef11f640 -1 osd.0 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 set_numa_affinity not setting numa affinity
Jan 23 04:50:03 np0005593294 ceph-osd[77616]: osd.0 16 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 23 04:50:04 np0005593294 ceph-osd[77616]: osd.0 17 state: booting -> active
Jan 23 04:50:04 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 17 pg[1.0( empty local-lis/les=0/0 n=0 ec=11/11 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[11,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:04 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 17 pg[2.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[13,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:05 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 18 pg[7.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:05 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 18 pg[2.0( empty local-lis/les=17/18 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[13,17)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:05 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 18 pg[1.0( empty local-lis/les=17/18 n=0 ec=11/11 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[11,17)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:06 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 19 pg[7.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 24 pg[2.0( empty local-lis/les=17/18 n=0 ec=13/13 lis/c=17/17 les/c/f=18/18/0 sis=24 pruub=9.220177650s) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 30.096221924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 24 pg[2.0( empty local-lis/les=17/18 n=0 ec=13/13 lis/c=17/17 les/c/f=18/18/0 sis=24 pruub=9.220177650s) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown pruub 30.096221924s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1f( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1c( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1d( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1b( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1e( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.8( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.7( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.9( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.a( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.6( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.4( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.2( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.5( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.3( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.b( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.c( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.e( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.d( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.f( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.10( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.11( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.12( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.13( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.14( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.15( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.17( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.18( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.19( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.16( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1a( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.7( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.8( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.9( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.4( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.6( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.2( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.3( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.0( empty local-lis/les=24/25 n=0 ec=13/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.11( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.14( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.13( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.15( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.17( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.10( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.19( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.16( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Jan 23 04:50:13 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Jan 23 04:50:14 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Jan 23 04:50:14 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Jan 23 04:50:15 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Jan 23 04:50:15 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Jan 23 04:50:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 27 pg[7.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=27 pruub=14.500118256s) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active pruub 38.990123749s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 27 pg[7.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=27 pruub=14.500118256s) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown pruub 38.990123749s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:16 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 23 04:50:16 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1f( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1d( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1c( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.12( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.13( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.11( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.10( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.16( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.17( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.14( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.15( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.a( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.8( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.9( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.b( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.e( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.6( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.5( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.4( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.7( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.3( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.2( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.d( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.c( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.f( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1e( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.18( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.19( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1b( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1a( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.12( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1c( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.10( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.14( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.17( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.8( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.13( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.9( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.0( empty local-lis/les=27/28 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.6( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.4( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.7( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.3( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.2( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.c( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.18( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.19( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.15( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.7 deep-scrub starts
Jan 23 04:50:17 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.7 deep-scrub ok
Jan 23 04:50:19 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Jan 23 04:50:19 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Jan 23 04:50:20 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Jan 23 04:50:20 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Jan 23 04:50:21 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Jan 23 04:50:21 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Jan 23 04:50:22 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 23 04:50:22 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 23 04:50:23 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Jan 23 04:50:23 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Jan 23 04:50:24 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.0 deep-scrub starts
Jan 23 04:50:24 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.0 deep-scrub ok
Jan 23 04:50:25 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Jan 23 04:50:25 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Jan 23 04:50:26 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Jan 23 04:50:26 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.18( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.1a( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.18( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.19( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1a( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.1c( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1b( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1b( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.1d( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1a( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1c( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.1a( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.c( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.e( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.e( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.d( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.e( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.f( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.9( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.3( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.2( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.5( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.3( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.5( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.2( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.7( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.5( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.4( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.7( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.d( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.a( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.a( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.8( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.d( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.c( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.9( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.8( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.a( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.f( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.e( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.9( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.11( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.16( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.10( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.15( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.13( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.15( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.15( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.17( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.15( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.13( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.14( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.11( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.12( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.16( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.10( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1f( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.1c( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1f( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.1e( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.19( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569741249s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228610992s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.19( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569710732s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228610992s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927922249s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.586837769s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.15( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569553375s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228507996s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927888870s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.586837769s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.15( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569536209s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228507996s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.10( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927852631s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.586887360s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.13( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927933693s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.586959839s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.10( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927838326s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.586887360s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.13( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927897453s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.586959839s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.13( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569301605s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228446960s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.13( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569286346s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228446960s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.14( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927904129s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587131500s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.14( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927886963s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587131500s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.10( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569243431s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228557587s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927868843s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587200165s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927852631s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587200165s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568997383s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228363037s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.10( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569215775s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228557587s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568982124s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228363037s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927917480s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587375641s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568883896s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228378296s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927903175s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587375641s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568869591s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228378296s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.8( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927607536s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587207794s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568799019s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228431702s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568782806s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228431702s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.8( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927577972s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587207794s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.9( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927671432s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587364197s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.9( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927655220s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587364197s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927598953s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587387085s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927581787s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587387085s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.6( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927831650s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587646484s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.6( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927813530s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587646484s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.4( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927702904s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587638855s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568323135s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228282928s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568302155s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228282928s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.4( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568067551s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228092194s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.6( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568223000s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228286743s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.4( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568034172s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228092194s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.6( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568204880s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228286743s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.3( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927593231s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587711334s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.3( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927577019s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587711334s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.2( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927405357s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587703705s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567815781s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228160858s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.9( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567931175s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228275299s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567797661s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228160858s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.9( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567910194s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228275299s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.2( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927258492s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587703705s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927349091s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587882996s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927308083s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587882996s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927147865s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587757111s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567054749s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.227695465s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567035675s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.227695465s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927106857s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587757111s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927084923s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587886810s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566875458s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.227695465s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927068710s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587886810s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.4( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.926847458s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587638855s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566858292s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.227695465s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566905975s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.227867126s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566858292s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.227867126s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.18( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.926668167s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587890625s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:50:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.18( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.926606178s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587890625s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.18( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.19( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.18( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.1a( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.1c( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1b( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1b( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1c( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1a( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.d( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.c( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.e( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.f( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.3( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.2( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.5( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.2( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.5( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.3( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.5( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.7( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.7( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.8( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.d( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.a( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.c( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.9( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.a( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.15( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.10( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.13( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.f( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.16( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.15( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.13( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.11( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.14( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1f( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.16( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.10( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:28 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:50:29 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 23 04:50:29 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 23 04:50:30 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 23 04:50:30 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 23 04:50:31 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 23 04:50:31 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 23 04:50:32 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 23 04:50:32 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 23 04:50:33 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 23 04:50:33 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 23 04:50:34 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.11 deep-scrub starts
Jan 23 04:50:34 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.11 deep-scrub ok
Jan 23 04:50:35 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 23 04:50:35 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 23 04:50:36 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 23 04:50:36 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 23 04:50:37 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.17 deep-scrub starts
Jan 23 04:50:37 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.17 deep-scrub ok
Jan 23 04:50:38 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 23 04:50:38 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 23 04:50:39 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 23 04:50:39 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 23 04:50:39 np0005593294 python3[79783]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:50:40 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 23 04:50:40 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 23 04:50:41 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.b scrub starts
Jan 23 04:50:41 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.b scrub ok
Jan 23 04:50:42 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 23 04:50:42 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 23 04:50:43 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.5 deep-scrub starts
Jan 23 04:50:43 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.5 deep-scrub ok
Jan 23 04:50:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Jan 23 04:50:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Jan 23 04:50:45 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 23 04:50:45 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.112578396 +0000 UTC m=+0.038624331 container create 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:50:46 np0005593294 systemd[72579]: Starting Mark boot as successful...
Jan 23 04:50:46 np0005593294 systemd[1]: Started libpod-conmon-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope.
Jan 23 04:50:46 np0005593294 systemd[72579]: Finished Mark boot as successful.
Jan 23 04:50:46 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.172433799 +0000 UTC m=+0.098479774 container init 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.180005861 +0000 UTC m=+0.106051796 container start 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.183467392 +0000 UTC m=+0.109513387 container attach 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:50:46 np0005593294 determined_cray[79905]: 167 167
Jan 23 04:50:46 np0005593294 systemd[1]: libpod-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope: Deactivated successfully.
Jan 23 04:50:46 np0005593294 conmon[79905]: conmon 61ed6a2a4e3de4ff6fb9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope/container/memory.events
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.186915621 +0000 UTC m=+0.112961576 container died 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.096222758 +0000 UTC m=+0.022268713 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:46 np0005593294 systemd[1]: var-lib-containers-storage-overlay-ba6b10ec9127ad71c246b54227d169f244d1e8b67c2aa1dd44fe8281e0da0aba-merged.mount: Deactivated successfully.
Jan 23 04:50:46 np0005593294 podman[79888]: 2026-01-23 09:50:46.225187748 +0000 UTC m=+0.151233703 container remove 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 04:50:46 np0005593294 systemd[1]: libpod-conmon-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope: Deactivated successfully.
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.30084892 +0000 UTC m=+0.052746809 container create ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:50:46 np0005593294 systemd[1]: Started libpod-conmon-ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f.scope.
Jan 23 04:50:46 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:50:46 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:46 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:46 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:46 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.275401678 +0000 UTC m=+0.027299657 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.386468078 +0000 UTC m=+0.138366057 container init ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.396051179 +0000 UTC m=+0.147949078 container start ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.400223394 +0000 UTC m=+0.152121323 container attach ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:50:46 np0005593294 systemd[1]: libpod-ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f.scope: Deactivated successfully.
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.488765743 +0000 UTC m=+0.240663742 container died ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:50:46 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Jan 23 04:50:46 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Jan 23 04:50:46 np0005593294 systemd[1]: var-lib-containers-storage-overlay-9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323-merged.mount: Deactivated successfully.
Jan 23 04:50:46 np0005593294 podman[79921]: 2026-01-23 09:50:46.540752215 +0000 UTC m=+0.292650104 container remove ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 23 04:50:46 np0005593294 systemd[1]: libpod-conmon-ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f.scope: Deactivated successfully.
Jan 23 04:50:46 np0005593294 systemd[1]: Reloading.
Jan 23 04:50:46 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:46 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:46 np0005593294 systemd[1]: Reloading.
Jan 23 04:50:46 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:46 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:47 np0005593294 systemd[1]: Starting Ceph mon.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:50:47 np0005593294 podman[80106]: 2026-01-23 09:50:47.306073598 +0000 UTC m=+0.043630782 container create c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-1, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:50:47 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:47 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:47 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:47 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:47 np0005593294 podman[80106]: 2026-01-23 09:50:47.368408119 +0000 UTC m=+0.105965313 container init c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:50:47 np0005593294 podman[80106]: 2026-01-23 09:50:47.375588147 +0000 UTC m=+0.113145321 container start c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:50:47 np0005593294 bash[80106]: c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747
Jan 23 04:50:47 np0005593294 podman[80106]: 2026-01-23 09:50:47.286600924 +0000 UTC m=+0.024158138 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:47 np0005593294 systemd[1]: Started Ceph mon.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: pidfile_write: ignore empty --pid-file
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: load: jerasure load: lrc 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Git sha 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: DB SUMMARY
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: DB Session ID:  PH7FUS34ITA44089QBF9
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 636 ; 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                                     Options.env: 0x563e77fecc20
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                                Options.info_log: 0x563e79a7ba20
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                                 Options.wal_dir: 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                    Options.write_buffer_manager: 0x563e79a7f900
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                               Options.row_cache: None
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                              Options.wal_filter: None
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.wal_compression: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.max_background_jobs: 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.max_total_wal_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:       Options.compaction_readahead_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Compression algorithms supported:
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kZSTD supported: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kXpressCompression supported: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kBZip2Compression supported: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kLZ4Compression supported: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kZlibCompression supported: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: #011kSnappyCompression supported: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:           Options.merge_operator: 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:        Options.compaction_filter: None
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x563e79a7a5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563e79a9f350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:        Options.write_buffer_size: 33554432
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:  Options.max_write_buffer_number: 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.compression: NoCompression
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.num_levels: 7
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1897ab4a-12ed-4850-8782-7d536e06cd96
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161847414070, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161847416344, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1773, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161847416441, "job": 1, "event": "recovery_finished"}
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x563e79aa0e00
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: DB pointer 0x563e79baa000
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.73 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.73 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(???) e0 preinit fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012btime 2026-01-23T09:47:38:565964+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 1 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1144026165' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2695482257' entity='client.admin' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Saving service ingress.rgw.default spec with placement count:2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Saving service node-exporter spec with placement *
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Saving service grafana spec with placement compute-0;count:1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Saving service prometheus spec with placement compute-0;count:1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Saving service alertmanager spec with placement compute-0;count:1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1143624271' entity='client.admin' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/3906855381' entity='client.admin' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2854364725' entity='client.admin' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Deploying daemon mon.compute-2 on compute-2
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Cluster is now healthy
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/2852887520' entity='client.admin' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-0 calling monitor election
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-2 calling monitor election
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: overall HEALTH_OK
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: Deploying daemon mon.compute-1 on compute-1
Jan 23 04:50:47 np0005593294 ceph-mon[80126]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 23 04:50:47 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 23 04:50:47 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 23 04:50:48 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.d scrub starts
Jan 23 04:50:48 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.d scrub ok
Jan 23 04:50:49 np0005593294 ceph-mon[80126]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 23 04:50:49 np0005593294 ceph-mon[80126]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 23 04:50:49 np0005593294 ceph-mon[80126]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 23 04:50:49 np0005593294 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:49 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 23 04:50:49 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 23 04:50:50 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1a deep-scrub starts
Jan 23 04:50:50 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1a deep-scrub ok
Jan 23 04:50:51 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 23 04:50:51 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 23 04:50:52 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Jan 23 04:50:52 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Jan 23 04:50:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 23 04:50:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 23 04:50:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:52 np0005593294 ceph-mon[80126]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Jan 23 04:50:53 np0005593294 ceph-mon[80126]: mon.compute-0 calling monitor election
Jan 23 04:50:53 np0005593294 ceph-mon[80126]: mon.compute-2 calling monitor election
Jan 23 04:50:53 np0005593294 ceph-mon[80126]: mon.compute-1 calling monitor election
Jan 23 04:50:53 np0005593294 ceph-mon[80126]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 04:50:53 np0005593294 ceph-mon[80126]: overall HEALTH_OK
Jan 23 04:50:53 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:53 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 23 04:50:53 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: Deploying daemon mgr.compute-2.uczrot on compute-2
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/4282911488' entity='client.admin' 
Jan 23 04:50:54 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.18 deep-scrub starts
Jan 23 04:50:54 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.18 deep-scrub ok
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:50:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:50:55 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Jan 23 04:50:55 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.717584531 +0000 UTC m=+0.049702039 container create 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 23 04:50:55 np0005593294 systemd[1]: Started libpod-conmon-94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a.scope.
Jan 23 04:50:55 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.696244045 +0000 UTC m=+0.028361603 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.794692789 +0000 UTC m=+0.126810307 container init 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.803043327 +0000 UTC m=+0.135160835 container start 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 23 04:50:55 np0005593294 ecstatic_pascal[80270]: 167 167
Jan 23 04:50:55 np0005593294 systemd[1]: libpod-94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a.scope: Deactivated successfully.
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.811859181 +0000 UTC m=+0.143976699 container attach 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.81277315 +0000 UTC m=+0.144890658 container died 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 04:50:55 np0005593294 systemd[1]: var-lib-containers-storage-overlay-b42a0ebdf80131d105ff0e320c043f62051da928516dd336d1618a6dabec38a4-merged.mount: Deactivated successfully.
Jan 23 04:50:55 np0005593294 podman[80254]: 2026-01-23 09:50:55.854371217 +0000 UTC m=+0.186488715 container remove 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True)
Jan 23 04:50:55 np0005593294 systemd[1]: libpod-conmon-94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a.scope: Deactivated successfully.
Jan 23 04:50:55 np0005593294 systemd[1]: Reloading.
Jan 23 04:50:55 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:55 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:50:56 np0005593294 ceph-mon[80126]: Deploying daemon mgr.compute-1.jmakme on compute-1
Jan 23 04:50:56 np0005593294 systemd[1]: Reloading.
Jan 23 04:50:56 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:56 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:56 np0005593294 systemd[1]: Starting Ceph mgr.compute-1.jmakme for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:50:56 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Jan 23 04:50:56 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Jan 23 04:50:56 np0005593294 podman[80413]: 2026-01-23 09:50:56.681372266 +0000 UTC m=+0.056997643 container create c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True)
Jan 23 04:50:56 np0005593294 podman[80413]: 2026-01-23 09:50:56.648147108 +0000 UTC m=+0.023772555 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/var/lib/ceph/mgr/ceph-compute-1.jmakme supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:56 np0005593294 podman[80413]: 2026-01-23 09:50:56.766970587 +0000 UTC m=+0.142596034 container init c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 04:50:56 np0005593294 podman[80413]: 2026-01-23 09:50:56.784001615 +0000 UTC m=+0.159627032 container start c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:50:56 np0005593294 bash[80413]: c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4
Jan 23 04:50:56 np0005593294 systemd[1]: Started Ceph mgr.compute-1.jmakme for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:56 np0005593294 ceph-mgr[80432]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:50:56 np0005593294 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:50:56 np0005593294 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 04:50:56 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 04:50:56 np0005593294 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 04:50:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:56.988+0000 7fcb174f0140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:57.081+0000 7fcb174f0140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593294 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 04:50:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1019916908 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:50:57 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Jan 23 04:50:57 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Jan 23 04:50:57 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 04:50:57 np0005593294 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:57.942+0000 7fcb174f0140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 04:50:58 np0005593294 ceph-mon[80126]: Deploying daemon crash.compute-2 on compute-2
Jan 23 04:50:58 np0005593294 systemd[1]: session-30.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-28.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-31.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-32.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-32.scope: Consumed 1min 14.985s CPU time.
Jan 23 04:50:58 np0005593294 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-20.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 30 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 23 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd[1]: session-29.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-27.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 28 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 32 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 31 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 26 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 24 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 20 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 27 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 29 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 22 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Session 25 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 30.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 28.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 31.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 32.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 23.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 24.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 20.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 26.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 29.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 27.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 22.
Jan 23 04:50:58 np0005593294 systemd-logind[807]: Removed session 25.
Jan 23 04:50:58 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1a deep-scrub starts
Jan 23 04:50:58 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1a deep-scrub ok
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:50:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:58.689+0000 7fcb174f0140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:50:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:50:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:  from numpy import show_config as show_numpy_config
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 04:50:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:58.897+0000 7fcb174f0140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 04:50:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:58.990+0000 7fcb174f0140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 04:50:59 np0005593294 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:50:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:59.166+0000 7fcb174f0140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 04:50:59 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 23 04:50:59 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 23 04:50:59 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 04:50:59 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:50:59 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.296+0000 7fcb174f0140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.512+0000 7fcb174f0140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.606+0000 7fcb174f0140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.682+0000 7fcb174f0140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 04:51:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.767+0000 7fcb174f0140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.842+0000 7fcb174f0140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:01.210+0000 7fcb174f0140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 04:51:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:01.307+0000 7fcb174f0140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:01 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 04:51:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:01.747+0000 7fcb174f0140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.299+0000 7fcb174f0140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.374+0000 7fcb174f0140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020052916 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 04:51:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.455+0000 7fcb174f0140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Jan 23 04:51:02 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.626+0000 7fcb174f0140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.698+0000 7fcb174f0140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:02 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.848+0000 7fcb174f0140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.054+0000 7fcb174f0140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.319+0000 7fcb174f0140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.387+0000 7fcb174f0140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x560ee3ff8d00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  4: '--setuser'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  5: 'ceph'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  6: '--setgroup'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  7: 'ceph'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr respawn  exe_path /proc/self/exe
Jan 23 04:51:03 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.d scrub starts
Jan 23 04:51:03 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.d scrub ok
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.646+0000 7f5dcc91b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 04:51:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.738+0000 7f5dcc91b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:04 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 04:51:04 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1c deep-scrub starts
Jan 23 04:51:04 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1c deep-scrub ok
Jan 23 04:51:04 np0005593294 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:04.525+0000 7f5dcc91b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:04 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.161+0000 7f5dcc91b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:  from numpy import show_config as show_numpy_config
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.324+0000 7f5dcc91b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.395+0000 7f5dcc91b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 04:51:05 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 23 04:51:05 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.541+0000 7f5dcc91b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 04:51:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 e32: 2 total, 2 up, 2 in
Jan 23 04:51:05 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:06 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.c deep-scrub starts
Jan 23 04:51:06 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.c deep-scrub ok
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.540+0000 7f5dcc91b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 systemd-logind[807]: New session 33 of user ceph-admin.
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.767+0000 7f5dcc91b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 systemd[1]: Started Session 33 of User ceph-admin.
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.851+0000 7f5dcc91b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:06 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.919+0000 7f5dcc91b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 04:51:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.003+0000 7f5dcc91b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.076+0000 7f5dcc91b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:51:07 np0005593294 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.428+0000 7f5dcc91b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054705 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:07 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.e scrub starts
Jan 23 04:51:07 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.e scrub ok
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.526+0000 7f5dcc91b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 04:51:07 np0005593294 podman[80621]: 2026-01-23 09:51:07.607599594 +0000 UTC m=+0.070749796 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:07 np0005593294 podman[80621]: 2026-01-23 09:51:07.704037882 +0000 UTC m=+0.167188114 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 04:51:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.940+0000 7f5dcc91b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 23 04:51:08 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.483+0000 7f5dcc91b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.563+0000 7f5dcc91b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: Manager daemon compute-0.nbdygh is now available
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 04:51:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.646+0000 7f5dcc91b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.798+0000 7f5dcc91b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.875+0000 7f5dcc91b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.046+0000 7f5dcc91b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.275+0000 7f5dcc91b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 23 04:51:09 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.535+0000 7f5dcc91b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.613+0000 7f5dcc91b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: mgr load Constructed class from module: dashboard
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x564cc1832d00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Starting engine...
Jan 23 04:51:09 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Engine started...
Jan 23 04:51:09 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Bus STARTING
Jan 23 04:51:09 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Serving on http://192.168.122.100:8765
Jan 23 04:51:09 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Serving on https://192.168.122.100:7150
Jan 23 04:51:09 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Bus STARTED
Jan 23 04:51:09 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Client ('192.168.122.100', 55612) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 04:51:09 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:10 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 23 04:51:10 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 23 04:51:11 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Jan 23 04:51:11 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Jan 23 04:51:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:12 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Jan 23 04:51:12 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Unable to set osd_memory_target on compute-0 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Adjusting osd_memory_target on compute-1 to 127.9M
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Unable to set osd_memory_target on compute-1 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:51:13 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:51:13 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Jan 23 04:51:13 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.5 deep-scrub starts
Jan 23 04:51:14 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.5 deep-scrub ok
Jan 23 04:51:15 np0005593294 ceph-mon[80126]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 23 04:51:15 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 23 04:51:15 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 23 04:51:16 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:16 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Jan 23 04:51:16 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Jan 23 04:51:17 np0005593294 systemd[1]: Reloading.
Jan 23 04:51:17 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:51:17 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:51:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:17 np0005593294 systemd[1]: Reloading.
Jan 23 04:51:17 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 23 04:51:17 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 23 04:51:17 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:51:17 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:51:17 np0005593294 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:17 np0005593294 ceph-mon[80126]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 23 04:51:17 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 04:51:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 04:51:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 04:51:17 np0005593294 systemd[1]: Starting Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:51:17 np0005593294 systemd-logind[807]: Session 33 logged out. Waiting for processes to exit.
Jan 23 04:51:18 np0005593294 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 04:51:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:18.035+0000 7f013d384140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593294 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:18.117+0000 7f013d384140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 04:51:18 np0005593294 bash[81996]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 23 04:51:18 np0005593294 bash[81996]: Getting image source signatures
Jan 23 04:51:18 np0005593294 bash[81996]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 23 04:51:18 np0005593294 bash[81996]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 23 04:51:18 np0005593294 bash[81996]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 23 04:51:18 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Jan 23 04:51:18 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Jan 23 04:51:18 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 04:51:18 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 04:51:18 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.032+0000 7f013d384140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 bash[81996]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 23 04:51:19 np0005593294 bash[81996]: Writing manifest to image destination
Jan 23 04:51:19 np0005593294 podman[81996]: 2026-01-23 09:51:19.310044666 +0000 UTC m=+1.139420861 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 23 04:51:19 np0005593294 podman[81996]: 2026-01-23 09:51:19.331257138 +0000 UTC m=+1.160633293 container create 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:19 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a9cf7871bcaed94d55bf97d20a317e95aa8ecd54623be987a412d3816ee0ab4/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:19 np0005593294 podman[81996]: 2026-01-23 09:51:19.397726994 +0000 UTC m=+1.227103139 container init 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:19 np0005593294 podman[81996]: 2026-01-23 09:51:19.407257301 +0000 UTC m=+1.236633446 container start 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:19 np0005593294 bash[81996]: 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6
Jan 23 04:51:19 np0005593294 systemd[1]: Started Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.420Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.420Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=dmi
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=entropy
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=os
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=pressure
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=rapl
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=selinux
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=stat
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=textfile
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=time
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=uname
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.427Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.427Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 23 04:51:19 np0005593294 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 04:51:19 np0005593294 systemd[1]: session-33.scope: Consumed 6.172s CPU time.
Jan 23 04:51:19 np0005593294 systemd-logind[807]: Removed session 33.
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:51:19 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Jan 23 04:51:19 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.711+0000 7f013d384140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:  from numpy import show_config as show_numpy_config
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.882+0000 7f013d384140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 04:51:19 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 04:51:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.954+0000 7f013d384140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:20.098+0000 7f013d384140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:51:20 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.4 deep-scrub starts
Jan 23 04:51:20 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.4 deep-scrub ok
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:20 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.149+0000 7f013d384140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.380+0000 7f013d384140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.497+0000 7f013d384140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.577+0000 7f013d384140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Jan 23 04:51:21 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 04:51:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.659+0000 7f013d384140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.736+0000 7f013d384140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:22.083+0000 7f013d384140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 04:51:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:22.181+0000 7f013d384140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:22 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Jan 23 04:51:22 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 04:51:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:22.636+0000 7f013d384140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.198+0000 7f013d384140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.267+0000 7f013d384140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.342+0000 7f013d384140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.488+0000 7f013d384140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.572+0000 7f013d384140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Jan 23 04:51:23 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.735+0000 7f013d384140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.951+0000 7f013d384140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.218+0000 7f013d384140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.283+0000 7f013d384140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x559728edd860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  4: '--setuser'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  5: 'ceph'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  6: '--setgroup'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  7: 'ceph'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:51:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 04:51:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 04:51:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.525+0000 7fe273115140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 04:51:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.618+0000 7fe273115140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 04:51:24 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.a scrub starts
Jan 23 04:51:24 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.a scrub ok
Jan 23 04:51:25 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e33 e33: 2 total, 2 up, 2 in
Jan 23 04:51:25 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 04:51:25 np0005593294 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:51:25 np0005593294 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 04:51:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:25.410+0000 7fe273115140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:25 np0005593294 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:25 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 04:51:25 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Jan 23 04:51:25 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Jan 23 04:51:25 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.012+0000 7fe273115140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:  from numpy import show_config as show_numpy_config
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.161+0000 7fe273115140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.231+0000 7fe273115140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 04:51:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.381+0000 7fe273115140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:51:26 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.d scrub starts
Jan 23 04:51:26 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.d scrub ok
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 04:51:26 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.335+0000 7fe273115140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.544+0000 7fe273115140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.613+0000 7fe273115140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.678+0000 7fe273115140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:27 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 23 04:51:27 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 23 04:51:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.754+0000 7fe273115140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 04:51:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.823+0000 7fe273115140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:28.174+0000 7fe273115140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:28.279+0000 7fe273115140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:28 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.d deep-scrub starts
Jan 23 04:51:28 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.d deep-scrub ok
Jan 23 04:51:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:28.695+0000 7fe273115140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 04:51:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.247+0000 7fe273115140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.321+0000 7fe273115140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.411+0000 7fe273115140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 04:51:29 np0005593294 systemd[1]: Stopping User Manager for UID 42477...
Jan 23 04:51:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.573+0000 7fe273115140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 systemd[72579]: Activating special unit Exit the Session...
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped target Main User Target.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped target Basic System.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped target Paths.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped target Sockets.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped target Timers.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:29 np0005593294 systemd[72579]: Closed D-Bus User Message Bus Socket.
Jan 23 04:51:29 np0005593294 systemd[72579]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:51:29 np0005593294 systemd[72579]: Removed slice User Application Slice.
Jan 23 04:51:29 np0005593294 systemd[72579]: Reached target Shutdown.
Jan 23 04:51:29 np0005593294 systemd[72579]: Finished Exit the Session.
Jan 23 04:51:29 np0005593294 systemd[72579]: Reached target Exit the Session.
Jan 23 04:51:29 np0005593294 systemd[1]: user@42477.service: Deactivated successfully.
Jan 23 04:51:29 np0005593294 systemd[1]: Stopped User Manager for UID 42477.
Jan 23 04:51:29 np0005593294 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 23 04:51:29 np0005593294 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 23 04:51:29 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Jan 23 04:51:29 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Jan 23 04:51:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.658+0000 7fe273115140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:29 np0005593294 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 23 04:51:29 np0005593294 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 23 04:51:29 np0005593294 systemd[1]: Removed slice User Slice of UID 42477.
Jan 23 04:51:29 np0005593294 systemd[1]: user-42477.slice: Consumed 1min 22.559s CPU time.
Jan 23 04:51:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.832+0000 7fe273115140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:30.064+0000 7fe273115140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:30.367+0000 7fe273115140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:30.439+0000 7fe273115140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: mgr load Constructed class from module: dashboard
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Starting engine...
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x5616c891d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 04:51:30 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Engine started...
Jan 23 04:51:30 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Jan 23 04:51:30 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Jan 23 04:51:31 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.a deep-scrub starts
Jan 23 04:51:31 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.a deep-scrub ok
Jan 23 04:51:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:32 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.a scrub starts
Jan 23 04:51:32 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.a scrub ok
Jan 23 04:51:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e34 e34: 2 total, 2 up, 2 in
Jan 23 04:51:33 np0005593294 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:51:33 np0005593294 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 04:51:33 np0005593294 ceph-mon[80126]: Manager daemon compute-0.nbdygh is now available
Jan 23 04:51:33 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 04:51:33 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 04:51:33 np0005593294 systemd-logind[807]: New session 34 of user ceph-admin.
Jan 23 04:51:33 np0005593294 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 04:51:33 np0005593294 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 04:51:33 np0005593294 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 04:51:33 np0005593294 systemd[1]: Starting User Manager for UID 42477...
Jan 23 04:51:33 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.e deep-scrub starts
Jan 23 04:51:33 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.e deep-scrub ok
Jan 23 04:51:33 np0005593294 systemd[82140]: Queued start job for default target Main User Target.
Jan 23 04:51:33 np0005593294 systemd[82140]: Created slice User Application Slice.
Jan 23 04:51:33 np0005593294 systemd[82140]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:51:33 np0005593294 systemd[82140]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:51:33 np0005593294 systemd[82140]: Reached target Paths.
Jan 23 04:51:33 np0005593294 systemd[82140]: Reached target Timers.
Jan 23 04:51:33 np0005593294 systemd[82140]: Starting D-Bus User Message Bus Socket...
Jan 23 04:51:33 np0005593294 systemd[82140]: Starting Create User's Volatile Files and Directories...
Jan 23 04:51:33 np0005593294 systemd[82140]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:51:33 np0005593294 systemd[82140]: Reached target Sockets.
Jan 23 04:51:33 np0005593294 systemd[82140]: Finished Create User's Volatile Files and Directories.
Jan 23 04:51:33 np0005593294 systemd[82140]: Reached target Basic System.
Jan 23 04:51:33 np0005593294 systemd[82140]: Reached target Main User Target.
Jan 23 04:51:33 np0005593294 systemd[82140]: Startup finished in 155ms.
Jan 23 04:51:33 np0005593294 systemd[1]: Started User Manager for UID 42477.
Jan 23 04:51:33 np0005593294 systemd[1]: Started Session 34 of User ceph-admin.
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e2 new map
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e2 print_map
                                              e2
                                              btime 2026-01-23T09:51:34:000852+0000
                                              enable_multiple, ever_enabled_multiple: 1,1
                                              default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                              legacy client fscid: 1
                                              
                                              Filesystem 'cephfs' (1)
                                              fs_name	cephfs
                                              epoch	2
                                              flags	12 joinable allow_snaps allow_multimds_snaps
                                              created	2026-01-23T09:51:34.000760+0000
                                              modified	2026-01-23T09:51:34.000760+0000
                                              tableserver	0
                                              root	0
                                              session_timeout	60
                                              session_autoclose	300
                                              max_file_size	1099511627776
                                              max_xattr_size	65536
                                              required_client_features	{}
                                              last_failure	0
                                              last_failure_osd_epoch	0
                                              compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                              max_mds	1
                                              in	
                                              up	{}
                                              failed	
                                              damaged	
                                              stopped	
                                              data_pools	[7]
                                              metadata_pool	6
                                              inline_data	disabled
                                              balancer	
                                              bal_rank_mask	-1
                                              standby_count_wanted	0
                                              qdb_cluster	leader: 0 members: 
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e35 e35: 2 total, 2 up, 2 in
Jan 23 04:51:34 np0005593294 podman[82275]: 2026-01-23 09:51:34.55739482 +0000 UTC m=+0.101723451 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 23 04:51:34 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:34 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 23 04:51:34 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 23 04:51:34 np0005593294 podman[82275]: 2026-01-23 09:51:34.684437733 +0000 UTC m=+0.228766324 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:51:35 np0005593294 podman[82396]: 2026-01-23 09:51:35.220751629 +0000 UTC m=+0.061746475 container exec 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:35 np0005593294 podman[82396]: 2026-01-23 09:51:35.231876857 +0000 UTC m=+0.072871673 container exec_died 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Bus STARTING
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Serving on http://192.168.122.100:8765
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Serving on https://192.168.122.100:7150
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Bus STARTED
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Client ('192.168.122.100', 48072) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Jan 23 04:51:35 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Jan 23 04:51:36 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Jan 23 04:51:36 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Jan 23 04:51:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e36 e36: 2 total, 2 up, 2 in
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:37 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.f scrub starts
Jan 23 04:51:37 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.f scrub ok
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 23 04:51:37 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 23 04:51:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e37 e37: 2 total, 2 up, 2 in
Jan 23 04:51:38 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 23 04:51:38 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e38 e38: 2 total, 2 up, 2 in
Jan 23 04:51:39 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Jan 23 04:51:39 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Jan 23 04:51:40 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Jan 23 04:51:41 np0005593294 ceph-mon[80126]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 23 04:51:41 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Jan 23 04:51:41 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Jan 23 04:51:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:42 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 23 04:51:42 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 23 04:51:43 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 23 04:51:43 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 23 04:51:43 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 23 04:51:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1f deep-scrub starts
Jan 23 04:51:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1f deep-scrub ok
Jan 23 04:51:44 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 23 04:51:44 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:45 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 23 04:51:45 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 23 04:51:46 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:46 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:46 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:46 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:51:46 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:51:46 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1e deep-scrub starts
Jan 23 04:51:46 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1e deep-scrub ok
Jan 23 04:51:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:47 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Jan 23 04:51:47 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Jan 23 04:51:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 e39: 3 total, 2 up, 3 in
Jan 23 04:51:48 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.102:0/1205331151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 04:51:48 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 04:51:48 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Jan 23 04:51:48 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Jan 23 04:51:49 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]': finished
Jan 23 04:51:49 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:49 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/3560526778' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 23 04:51:49 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Jan 23 04:51:49 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Jan 23 04:51:50 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Jan 23 04:51:50 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Jan 23 04:51:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:53 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:53 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 23 04:51:56 np0005593294 ceph-mon[80126]: Deploying daemon osd.2 on compute-2
Jan 23 04:51:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:02 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:02 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:03 np0005593294 ceph-mon[80126]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:52:03 np0005593294 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:52:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e40 e40: 3 total, 2 up, 3 in
Jan 23 04:52:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e41 e41: 3 total, 2 up, 3 in
Jan 23 04:52:07 np0005593294 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 23 04:52:07 np0005593294 ceph-mon[80126]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:52:07 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:07 np0005593294 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.784867287s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231704712s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.139515877s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.586380005s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.158160210s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605072021s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.784867287s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231704712s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.158160210s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605072021s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.139515877s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.586380005s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157796860s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605056763s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157771111s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605056763s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157796860s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605056763s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157771111s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605056763s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157330513s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605087280s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157330513s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605087280s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156840324s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604675293s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156840324s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604675293s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141293526s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.589202881s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157002449s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604904175s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157002449s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604904175s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156695366s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604660034s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141293526s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.589202881s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141396523s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.589385986s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141396523s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.589385986s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783594131s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231658936s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783594131s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231658936s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156449318s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604644775s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156449318s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604644775s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156120300s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604370117s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156564713s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604827881s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156564713s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604827881s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156120300s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604370117s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783216476s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231521606s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783216476s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231521606s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156695366s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604660034s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155824661s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604232788s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155824661s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604232788s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783000946s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231506348s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783000946s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231506348s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155418396s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603988647s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155418396s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603988647s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.140769005s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.589492798s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.140769005s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.589492798s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782610893s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231399536s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782610893s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231399536s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155240059s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604156494s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155240059s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604156494s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154916763s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603881836s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154916763s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603881836s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155308723s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604339600s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155308723s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604339600s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154391289s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603485107s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154391289s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603485107s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154301643s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603439331s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154301643s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603439331s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782231331s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231414795s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154086113s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603332520s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782231331s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231414795s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154086113s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603332520s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782031059s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231353760s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:07 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782031059s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231353760s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:08 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:08 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:08 np0005593294 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 23 04:52:08 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:08 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:08 np0005593294 ceph-mon[80126]: Deploying daemon rgw.rgw.compute-2.yzflfx on compute-2
Jan 23 04:52:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.158596716 +0000 UTC m=+0.065331549 container create 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:52:11 np0005593294 systemd[1]: Started libpod-conmon-44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a.scope.
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.136241359 +0000 UTC m=+0.042976292 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:11 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.252209386 +0000 UTC m=+0.158944329 container init 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.260181141 +0000 UTC m=+0.166915984 container start 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.263781352 +0000 UTC m=+0.170516225 container attach 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 04:52:11 np0005593294 brave_bartik[83582]: 167 167
Jan 23 04:52:11 np0005593294 systemd[1]: libpod-44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a.scope: Deactivated successfully.
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.269113126 +0000 UTC m=+0.175848029 container died 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 04:52:11 np0005593294 systemd[1]: var-lib-containers-storage-overlay-3b9e42ed369918945c1c16e408fdbf55bf82d3771d57a13416932a0bdaa12bd3-merged.mount: Deactivated successfully.
Jan 23 04:52:11 np0005593294 podman[83566]: 2026-01-23 09:52:11.323616712 +0000 UTC m=+0.230351575 container remove 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325)
Jan 23 04:52:11 np0005593294 systemd[1]: libpod-conmon-44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a.scope: Deactivated successfully.
Jan 23 04:52:11 np0005593294 systemd[1]: Reloading.
Jan 23 04:52:11 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:11 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:11 np0005593294 systemd[1]: Reloading.
Jan 23 04:52:11 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:11 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:11 np0005593294 systemd[1]: Starting Ceph rgw.rgw.compute-1.syfcuk for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:52:12 np0005593294 podman[83723]: 2026-01-23 09:52:12.242450254 +0000 UTC m=+0.095678374 container create f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:52:12 np0005593294 podman[83723]: 2026-01-23 09:52:12.17567846 +0000 UTC m=+0.028906630 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:12 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:12 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:12 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:12 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.syfcuk supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:12 np0005593294 podman[83723]: 2026-01-23 09:52:12.355851973 +0000 UTC m=+0.209080113 container init f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:12 np0005593294 podman[83723]: 2026-01-23 09:52:12.362762315 +0000 UTC m=+0.215990435 container start f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:52:12 np0005593294 bash[83723]: f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a
Jan 23 04:52:12 np0005593294 systemd[1]: Started Ceph rgw.rgw.compute-1.syfcuk for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:52:12 np0005593294 radosgw[83743]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:52:12 np0005593294 radosgw[83743]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 23 04:52:12 np0005593294 radosgw[83743]: framework: beast
Jan 23 04:52:12 np0005593294 radosgw[83743]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 23 04:52:12 np0005593294 radosgw[83743]: init_numa not setting numa affinity
Jan 23 04:52:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e42 e42: 3 total, 2 up, 3 in
Jan 23 04:52:15 np0005593294 ceph-mon[80126]: Deploying daemon rgw.rgw.compute-1.syfcuk on compute-1
Jan 23 04:52:16 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e43 e43: 3 total, 2 up, 3 in
Jan 23 04:52:17 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.102:0/2692084146' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:52:17 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:52:17 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:17 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:17 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 23 04:52:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e44 e44: 3 total, 2 up, 3 in
Jan 23 04:52:19 np0005593294 radosgw[83743]: rgw main: failed to create zone with (17) File exists
Jan 23 04:52:19 np0005593294 ceph-mon[80126]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:52:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:19 np0005593294 ceph-mon[80126]: OSD bench result of 2077.197482 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:52:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 23 04:52:20 np0005593294 radosgw[83743]: rgw main: failed to create zonegroup with (17) File exists
Jan 23 04:52:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 23 04:52:21 np0005593294 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:52:21 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:21 np0005593294 ceph-mon[80126]: Deploying daemon rgw.rgw.compute-0.jbpfwf on compute-0
Jan 23 04:52:21 np0005593294 ceph-mon[80126]: osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776] boot
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[10.0( empty local-lis/les=0/0 n=0 ec=46/46 lis/c=0/0 les/c/f=0/0/0 sis=46) [0] r=0 lpr=46 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 23 04:52:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 23 04:52:24 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 04:52:24 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 04:52:24 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:24 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 48 pg[10.0( empty local-lis/les=46/48 n=0 ec=46/46 lis/c=0/0 les/c/f=0/0/0 sis=46) [0] r=0 lpr=46 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:25 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 23 04:52:25 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:25 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:26 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Jan 23 04:52:26 np0005593294 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 23 04:52:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:28 np0005593294 ceph-mon[80126]: Deploying daemon mds.cephfs.compute-2.prgzmm on compute-2
Jan 23 04:52:28 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:52:28 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:52:28 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:52:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 23 04:52:29 np0005593294 ceph-mon[80126]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:52:29 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e3 new map
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e3 print_map
    e3
    btime 2026-01-23T09:52:30:834166+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name cephfs
    epoch 2
    flags 12 joinable allow_snaps allow_multimds_snaps
    created 2026-01-23T09:51:34.000760+0000
    modified 2026-01-23T09:51:34.000760+0000
    tableserver 0
    root 0
    session_timeout 60
    session_autoclose 300
    max_file_size 1099511627776
    max_xattr_size 65536
    required_client_features {}
    last_failure 0
    last_failure_osd_epoch 0
    compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds 1
    in
    up {}
    failed
    damaged
    stopped
    data_pools [7]
    metadata_pool 6
    inline_data disabled
    balancer
    bal_rank_mask -1
    standby_count_wanted 0
    qdb_cluster leader: 0 members:

    Standby daemons:

    [mds.cephfs.compute-2.prgzmm{-1:24193} state up:standby seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e4 new map
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e4 print_map
    e4
    btime 2026-01-23T09:52:31:070018+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name cephfs
    epoch 4
    flags 12 joinable allow_snaps allow_multimds_snaps
    created 2026-01-23T09:51:34.000760+0000
    modified 2026-01-23T09:52:31.070004+0000
    tableserver 0
    root 0
    session_timeout 60
    session_autoclose 300
    max_file_size 1099511627776
    max_xattr_size 65536
    required_client_features {}
    last_failure 0
    last_failure_osd_epoch 0
    compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds 1
    in 0
    up {0=24193}
    failed
    damaged
    stopped
    data_pools [7]
    metadata_pool 6
    inline_data disabled
    balancer
    bal_rank_mask -1
    standby_count_wanted 0
    qdb_cluster leader: 0 members:
    [mds.cephfs.compute-2.prgzmm{0:24193} state up:creating seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:31 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 52 pg[12.0( empty local-lis/les=0/0 n=0 ec=52/52 lis/c=0/0 les/c/f=0/0/0 sis=52) [0] r=0 lpr=52 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 23 04:52:31 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 53 pg[12.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=0/0 les/c/f=0/0/0 sis=52) [0] r=0 lpr=52 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Jan 23 04:52:31 np0005593294 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: Deploying daemon mds.cephfs.compute-0.ymknms on compute-0
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: daemon mds.cephfs.compute-2.prgzmm assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: Cluster is now healthy
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: daemon mds.cephfs.compute-2.prgzmm is now active in filesystem cephfs as rank 0
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e5 new map
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e5 print_map
    e5
    btime 2026-01-23T09:52:32:417167+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name cephfs
    epoch 5
    flags 12 joinable allow_snaps allow_multimds_snaps
    created 2026-01-23T09:51:34.000760+0000
    modified 2026-01-23T09:52:32.417165+0000
    tableserver 0
    root 0
    session_timeout 60
    session_autoclose 300
    max_file_size 1099511627776
    max_xattr_size 65536
    required_client_features {}
    last_failure 0
    last_failure_osd_epoch 0
    compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds 1
    in 0
    up {0=24193}
    failed
    damaged
    stopped
    data_pools [7]
    metadata_pool 6
    inline_data disabled
    balancer
    bal_rank_mask -1
    standby_count_wanted 0
    qdb_cluster leader: 24193 members: 24193
    [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e6 new map
Jan 23 04:52:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e6 print_map
    e6
    btime 2026-01-23T09:52:33:487599+0000
    enable_multiple, ever_enabled_multiple: 1,1
    default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    legacy client fscid: 1

    Filesystem 'cephfs' (1)
    fs_name cephfs
    epoch 5
    flags 12 joinable allow_snaps allow_multimds_snaps
    created 2026-01-23T09:51:34.000760+0000
    modified 2026-01-23T09:52:32.417165+0000
    tableserver 0
    root 0
    session_timeout 60
    session_autoclose 300
    max_file_size 1099511627776
    max_xattr_size 65536
    required_client_features {}
    last_failure 0
    last_failure_osd_epoch 0
    compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
    max_mds 1
    in 0
    up {0=24193}
    failed
    damaged
    stopped
    data_pools [7]
    metadata_pool 6
    inline_data disabled
    balancer
    bal_rank_mask -1
    standby_count_wanted 1
    qdb_cluster leader: 24193 members: 24193
    [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]

    Standby daemons:

    [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:34 np0005593294 radosgw[83743]: v1 topic migration: starting v1 topic migration..
Jan 23 04:52:34 np0005593294 radosgw[83743]: LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:52:34 np0005593294 radosgw[83743]: v1 topic migration: finished v1 topic migration
Jan 23 04:52:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk[83739]: 2026-01-23T09:52:34.017+0000 7f0b6033c980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593294 radosgw[83743]: framework: beast
Jan 23 04:52:34 np0005593294 radosgw[83743]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 23 04:52:34 np0005593294 radosgw[83743]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593294 radosgw[83743]: starting handler: beast
Jan 23 04:52:34 np0005593294 radosgw[83743]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:52:34 np0005593294 radosgw[83743]: mgrc service_daemon_register rgw.24176 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.syfcuk,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=75d0a494-c738-4cca-b87e-be71cfd0ed45,zone_name=default,zonegroup_id=6635d7c3-d02c-4c4b-90b3-4ee042e293d6,zonegroup_name=default}
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 23 04:52:35 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:35 np0005593294 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:52:35 np0005593294 ceph-mon[80126]: Cluster is now healthy
Jan 23 04:52:35 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 23 04:52:35 np0005593294 podman[84454]: 2026-01-23 09:52:35.891646965 +0000 UTC m=+0.087664779 container create a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:52:35 np0005593294 podman[84454]: 2026-01-23 09:52:35.827999266 +0000 UTC m=+0.024017110 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:36 np0005593294 systemd[1]: Started libpod-conmon-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope.
Jan 23 04:52:36 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:52:36 np0005593294 podman[84454]: 2026-01-23 09:52:36.06575423 +0000 UTC m=+0.261772144 container init a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:36 np0005593294 podman[84454]: 2026-01-23 09:52:36.086058683 +0000 UTC m=+0.282076507 container start a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Jan 23 04:52:36 np0005593294 podman[84454]: 2026-01-23 09:52:36.089932743 +0000 UTC m=+0.285950577 container attach a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:52:36 np0005593294 affectionate_hellman[84469]: 167 167
Jan 23 04:52:36 np0005593294 systemd[1]: libpod-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope: Deactivated successfully.
Jan 23 04:52:36 np0005593294 conmon[84469]: conmon a3b467b128f2927a220b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope/container/memory.events
Jan 23 04:52:36 np0005593294 podman[84454]: 2026-01-23 09:52:36.09668308 +0000 UTC m=+0.292700894 container died a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 04:52:36 np0005593294 systemd[1]: var-lib-containers-storage-overlay-c6c21647047ab91f3891dddc234e133b90e1a36c1c8aa506f5557ff081485538-merged.mount: Deactivated successfully.
Jan 23 04:52:36 np0005593294 podman[84454]: 2026-01-23 09:52:36.144043828 +0000 UTC m=+0.340061642 container remove a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:36 np0005593294 systemd[1]: libpod-conmon-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope: Deactivated successfully.
Jan 23 04:52:36 np0005593294 systemd[1]: Reloading.
Jan 23 04:52:36 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:36 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:36 np0005593294 systemd[1]: Reloading.
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 23 04:52:36 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:36 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:52:36 np0005593294 ceph-mon[80126]: Deploying daemon mds.cephfs.compute-1.bcvzvj on compute-1
Jan 23 04:52:36 np0005593294 systemd[1]: Starting Ceph mds.cephfs.compute-1.bcvzvj for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:52:37 np0005593294 podman[84611]: 2026-01-23 09:52:37.081922985 +0000 UTC m=+0.107530709 container create ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 23 04:52:37 np0005593294 podman[84611]: 2026-01-23 09:52:36.997352784 +0000 UTC m=+0.022960508 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:37 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:37 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:37 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:37 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.bcvzvj supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:37 np0005593294 podman[84611]: 2026-01-23 09:52:37.307942117 +0000 UTC m=+0.333549921 container init ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 04:52:37 np0005593294 podman[84611]: 2026-01-23 09:52:37.314936432 +0000 UTC m=+0.340544166 container start ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:52:37 np0005593294 bash[84611]: ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26
Jan 23 04:52:37 np0005593294 systemd[1]: Started Ceph mds.cephfs.compute-1.bcvzvj for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:52:37 np0005593294 ceph-mds[84630]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:52:37 np0005593294 ceph-mds[84630]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 23 04:52:37 np0005593294 ceph-mds[84630]: main not setting numa affinity
Jan 23 04:52:37 np0005593294 ceph-mds[84630]: pidfile_write: ignore empty --pid-file
Jan 23 04:52:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj[84626]: starting mds.cephfs.compute-1.bcvzvj at 
Jan 23 04:52:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 23 04:52:37 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Updating MDS map to version 6 from mon.2
Jan 23 04:52:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e7 new map
Jan 23 04:52:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e7 print_map#012e7#012btime 2026-01-23T09:52:38:529421+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:32.417165+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:38 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Updating MDS map to version 7 from mon.2
Jan 23 04:52:38 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Monitors have assigned me to become a standby
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[10.0( v 58'754 (0'0,58'754] local-lis/les=46/48 n=136 ec=46/46 lis/c=46/46 les/c/f=48/48/0 sis=59 pruub=8.448954582s) [0] r=0 lpr=59 pi=[46,59)/1 luod=58'752 crt=58'754 lcod 58'751 mlcod 58'751 active pruub 176.745971680s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[10.0( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=46/46 lis/c=46/46 les/c/f=48/48/0 sis=59 pruub=8.448954582s) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 58'751 mlcod 0'0 unknown pruub 176.745971680s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85428 space 0x55a560c13ef0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bac208 space 0x55a560c331f0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bad108 space 0x55a560c12760 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd60c8 space 0x55a560c32900 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bac168 space 0x55a560c32eb0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9cca8 space 0x55a560c329d0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bce848 space 0x55a560c000e0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd6528 space 0x55a560c13bb0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560baca28 space 0x55a560c32d10 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85068 space 0x55a560c13c80 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9c3e8 space 0x55a560c32830 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd7c48 space 0x55a560c33600 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bcfc48 space 0x55a560c32f80 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85ce8 space 0x55a55fde4d10 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bacb68 space 0x55a560c32de0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9d568 space 0x55a560c32690 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560ba20c8 space 0x55a560a901b0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd7ba8 space 0x55a560c12420 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560c365c8 space 0x55a560c32b70 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd6168 space 0x55a560aa4aa0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd7108 space 0x55a560a64900 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd6b68 space 0x55a560c13a10 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560937c48 space 0x55a560ae3c80 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85928 space 0x55a560ae3ae0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd72e8 space 0x55a560c12c40 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd5ba8 space 0x55a560c332c0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bac708 space 0x55a560c33120 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bceca8 space 0x55a560a90010 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9db08 space 0x55a560c32aa0 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bce7a8 space 0x55a560a91c80 0x0~1000 clean)
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.14( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.17( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.11( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.10( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.10( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.e( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.8( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.a( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.6( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.4( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.1b( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.19( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.18( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.12( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.12( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e8 new map
Jan 23 04:52:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e8 print_map#012e8#012btime 2026-01-23T09:52:40:798611+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:39.805778+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:41 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1b( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.7( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.12( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.11( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.10( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1f( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1e( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1d( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1c( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1a( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.19( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.18( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.5( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.4( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.6( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.3( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.b( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.8( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.d( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.9( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.a( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.c( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.e( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.f( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.2( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.13( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.14( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.15( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.16( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.17( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.14( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.17( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.a( v 44'12 lc 44'8 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.1b( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.4( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.18( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.12( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.6( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=59/60 n=1 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.10( v 37'1 lc 0'0 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.11( v 44'12 lc 43'1 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.f( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.19( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.7( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.11( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.12( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1e( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1f( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.8( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1c( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1d( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1a( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.19( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.5( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1b( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.4( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.b( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.d( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.3( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.8( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.6( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.a( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.c( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.2( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.0( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=46/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 58'751 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.13( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.15( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.14( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.17( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.e( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.f( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.050667265 +0000 UTC m=+0.031386247 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.296460215 +0000 UTC m=+0.277179157 container create e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e9 new map
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).mds e9 print_map#012e9#012btime 2026-01-23T09:52:42:200523+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:39.805778+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 23 04:52:42 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Updating MDS map to version 9 from mon.2
Jan 23 04:52:42 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 61 pg[12.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=61 pruub=13.239601135s) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active pruub 183.824020386s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:42 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 61 pg[12.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=61 pruub=13.239601135s) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown pruub 183.824020386s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:42 np0005593294 systemd[1]: Started libpod-conmon-e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348.scope.
Jan 23 04:52:42 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.a scrub starts
Jan 23 04:52:42 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:52:42 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.a scrub ok
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.562659523 +0000 UTC m=+0.543378455 container init e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.57559554 +0000 UTC m=+0.556314442 container start e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.580527912 +0000 UTC m=+0.561246844 container attach e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 23 04:52:42 np0005593294 elastic_elbakyan[84756]: 167 167
Jan 23 04:52:42 np0005593294 systemd[1]: libpod-e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348.scope: Deactivated successfully.
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.585526746 +0000 UTC m=+0.566245678 container died e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 04:52:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:42 np0005593294 systemd[1]: var-lib-containers-storage-overlay-f945ce81a4cb406bb8313d4d2c99099af9fd1612f9651a19907dd2f77fde1b38-merged.mount: Deactivated successfully.
Jan 23 04:52:42 np0005593294 podman[84740]: 2026-01-23 09:52:42.778475291 +0000 UTC m=+0.759194193 container remove e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:52:42 np0005593294 systemd[1]: libpod-conmon-e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348.scope: Deactivated successfully.
Jan 23 04:52:43 np0005593294 ceph-mon[80126]: Rados config object exists: conf-nfs.cephfs
Jan 23 04:52:43 np0005593294 ceph-mon[80126]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm-rgw
Jan 23 04:52:43 np0005593294 ceph-mon[80126]: Bind address in nfs.cephfs.0.0.compute-1.bawllm's ganesha conf is defaulting to empty
Jan 23 04:52:43 np0005593294 ceph-mon[80126]: Deploying daemon nfs.cephfs.0.0.compute-1.bawllm on compute-1
Jan 23 04:52:43 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:43 np0005593294 systemd[1]: Reloading.
Jan 23 04:52:43 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:43 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:43 np0005593294 systemd[1]: Reloading.
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.6 deep-scrub starts
Jan 23 04:52:43 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:43 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.6 deep-scrub ok
Jan 23 04:52:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.10( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.13( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.12( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.15( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.4( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.7( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.6( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.9( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.11( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.8( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.c( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.f( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.b( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.a( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.e( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.d( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.5( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.2( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.3( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1e( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1f( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1c( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1a( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1b( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.18( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.19( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.16( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.17( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.14( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1d( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:43 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.10( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.13( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.15( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.12( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.4( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.7( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.11( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.6( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.8( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.9( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.5( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.0( empty local-lis/les=61/62 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.2( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.3( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.f( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1f( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.18( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.16( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.14( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.17( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.19( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:44 np0005593294 podman[84897]: 2026-01-23 09:52:44.130299891 +0000 UTC m=+0.075700800 container create 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:44 np0005593294 podman[84897]: 2026-01-23 09:52:44.078908509 +0000 UTC m=+0.024309438 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Jan 23 04:52:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Jan 23 04:52:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:44 np0005593294 podman[84897]: 2026-01-23 09:52:44.638376327 +0000 UTC m=+0.583777326 container init 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:52:44 np0005593294 podman[84897]: 2026-01-23 09:52:44.645424895 +0000 UTC m=+0.590825844 container start 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Jan 23 04:52:44 np0005593294 bash[84897]: 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:52:44 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:52:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 23 04:52:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:45 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.f scrub starts
Jan 23 04:52:45 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.f scrub ok
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.10 deep-scrub starts
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.10 deep-scrub ok
Jan 23 04:52:46 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.11( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195859909s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157989502s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.11( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195804596s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157989502s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.10( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195192337s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157638550s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.10( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195178032s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157638550s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.617496490s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.580047607s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.617434502s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.580047607s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.13( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194779396s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157638550s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.13( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194762230s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157638550s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.12( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194700241s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157760620s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.12( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194680214s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157760620s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616744041s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616715431s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579940796s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616385460s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.580001831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616312027s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 185.580001831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616270065s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 185.580001831s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.7( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194222450s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157913208s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.7( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194145203s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157913208s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.6( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194022179s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157989502s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.6( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194001198s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157989502s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.615398407s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 185.579849243s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.615350723s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 185.579849243s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.9( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193523407s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158020020s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193923950s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158660889s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616319656s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.580001831s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193905830s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158660889s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.8( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193208694s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158004761s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.8( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193188667s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158004761s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.9( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193440437s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158020020s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193057060s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158050537s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193034172s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158050537s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614524841s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 185.579772949s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193387985s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158676147s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614482880s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 185.579772949s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193368912s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158676147s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614227295s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 185.579681396s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614196777s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 185.579681396s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193170547s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158721924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193154335s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158721924s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.4( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192028046s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157821655s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613624573s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579666138s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613601685s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579666138s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613553047s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'765 lcod 62'764 mlcod 62'764 active pruub 185.579681396s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613502502s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'765 lcod 62'764 mlcod 0'0 unknown NOTIFY pruub 185.579681396s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.2( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192445755s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158721924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.2( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192432404s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158721924s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613152504s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'766 lcod 62'765 mlcod 62'765 active pruub 185.579589844s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.3( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192131996s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158721924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613007545s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'766 lcod 62'765 mlcod 0'0 unknown NOTIFY pruub 185.579589844s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.3( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192109108s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158721924s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.4( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191854477s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157821655s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191747665s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158737183s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.612077713s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 185.579498291s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191413879s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158737183s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191987038s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159423828s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191963196s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159423828s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191950798s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159484863s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191933632s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159484863s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611672401s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 185.579452515s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.19( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191948891s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159820557s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.18( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191813469s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159683228s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.19( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191926956s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159820557s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611551285s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 185.579452515s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.18( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191771507s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159683228s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611083031s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 185.579376221s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611630440s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 185.579498291s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611034393s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 185.579376221s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.17( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191430092s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159805298s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.17( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191412926s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159805298s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610406876s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579299927s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610637665s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=60'756 lcod 60'755 mlcod 60'755 active pruub 185.579574585s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610358238s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579299927s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610615730s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=60'756 lcod 60'755 mlcod 0'0 unknown NOTIFY pruub 185.579574585s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610351562s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'769 lcod 62'768 mlcod 62'768 active pruub 185.579330444s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610065460s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'769 lcod 62'768 mlcod 0'0 unknown NOTIFY pruub 185.579330444s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.190383911s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159881592s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.190237999s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159881592s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.14( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.12( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.f( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.4( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.5( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.7( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1a( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1c( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1d( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1b( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1e( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593294 ceph-mon[80126]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi
Jan 23 04:52:46 np0005593294 ceph-mon[80126]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=60'756 lcod 60'755 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=60'756 lcod 60'755 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'769 lcod 62'768 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'769 lcod 62'768 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'766 lcod 62'765 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'766 lcod 62'765 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'765 lcod 62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'765 lcod 62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692672729s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 185.580017090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692641258s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 185.580017090s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692136765s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 185.580047607s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692108154s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 185.580047607s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691969872s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579925537s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691805840s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'773 lcod 62'772 mlcod 62'772 active pruub 185.579803467s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691919327s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579925537s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691760063s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'773 lcod 62'772 mlcod 0'0 unknown NOTIFY pruub 185.579803467s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691308022s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 185.579742432s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691273689s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 185.579742432s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690748215s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'756 lcod 61'755 mlcod 61'755 active pruub 185.579467773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690720558s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'756 lcod 61'755 mlcod 0'0 unknown NOTIFY pruub 185.579467773s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690556526s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 185.579376221s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690526962s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 185.579376221s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690239906s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'760 lcod 61'759 mlcod 61'759 active pruub 185.579360962s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690199852s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'760 lcod 61'759 mlcod 0'0 unknown NOTIFY pruub 185.579360962s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1a( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1e( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1c( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1b( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1d( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.7( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.5( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.4( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.f( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.12( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.14( v 60'51 lc 50'43 (0'0,60'51] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:52:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 23 04:52:47 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Jan 23 04:52:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'756 lcod 61'755 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'756 lcod 61'755 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'760 lcod 61'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'760 lcod 61'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'773 lcod 62'772 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'773 lcod 62'772 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'765 lcod 62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'769 lcod 62'768 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=64/65 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'766 lcod 62'765 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=64/65 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=60'756 lcod 60'755 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'761 lcod 62'760 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:48 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: Rados config object exists: conf-nfs.cephfs
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi-rgw
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: Bind address in nfs.cephfs.1.0.compute-2.tykohi's ganesha conf is defaulting to empty
Jan 23 04:52:49 np0005593294 ceph-mon[80126]: Deploying daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2
Jan 23 04:52:50 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873781204s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.074768066s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873719215s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.074768066s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873208046s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 193.074890137s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873147011s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 193.074890137s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872499466s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 193.074386597s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872417450s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 193.074386597s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872388840s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 193.074752808s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872339249s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.074752808s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.693450928s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 192.895935059s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871794701s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'765 lcod 62'764 mlcod 62'764 active pruub 193.074371338s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.693422318s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 192.895935059s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871710777s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'765 lcod 62'764 mlcod 0'0 unknown NOTIFY pruub 193.074371338s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871749878s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.074874878s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871699333s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'766 lcod 65'769 mlcod 65'769 active pruub 193.074935913s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871006012s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'769 lcod 62'768 mlcod 62'768 active pruub 193.074508667s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.870960236s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'769 lcod 62'768 mlcod 0'0 unknown NOTIFY pruub 193.074508667s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871294975s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.074935913s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871304512s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=60'756 lcod 60'755 mlcod 60'755 active pruub 193.074981689s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871256828s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.074935913s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871284485s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=60'756 lcod 60'755 mlcod 0'0 unknown NOTIFY pruub 193.074981689s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871036530s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'766 lcod 65'769 mlcod 0'0 unknown NOTIFY pruub 193.074935913s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871041298s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.074874878s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=65/66 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=65/66 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=65/66 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'773 lcod 62'772 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=65/66 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=65/66 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=61'756 lcod 61'755 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=65/66 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=61'760 lcod 61'759 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=65/66 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=65/66 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:52:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:52:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:51 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 04:52:51 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107666969s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 194.456970215s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.725880623s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.075225830s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107567787s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.456970215s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.108106613s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 194.457885742s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107128143s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 194.457000732s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.108001709s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 194.457885742s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107074738s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 194.457000732s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.725820541s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.075225830s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=65/66 n=7 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106925964s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'773 lcod 62'772 mlcod 62'772 active pruub 194.457015991s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=65/66 n=7 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106858253s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'773 lcod 62'772 mlcod 0'0 unknown NOTIFY pruub 194.457015991s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724931717s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 193.075210571s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724854469s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 193.075210571s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724407196s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 193.075180054s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724274635s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 193.075180054s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106864929s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 194.457885742s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106761932s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=61'756 lcod 61'755 mlcod 61'755 active pruub 194.458007812s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723976135s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 193.075302124s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106699944s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=61'756 lcod 61'755 mlcod 0'0 unknown NOTIFY pruub 194.458007812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=65/66 n=5 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106610298s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 194.458038330s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106502533s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 194.457885742s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723916054s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 193.075302124s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=65/66 n=5 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106540680s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 194.458038330s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723504066s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 193.075271606s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106259346s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=61'760 lcod 61'759 mlcod 61'759 active pruub 194.458068848s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106206894s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=61'760 lcod 61'759 mlcod 0'0 unknown NOTIFY pruub 194.458068848s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:51 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723349571s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 193.075271606s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:52 np0005593294 ceph-mon[80126]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu
Jan 23 04:52:52 np0005593294 ceph-mon[80126]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 23 04:52:52 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 04:52:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 23 04:52:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:54 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:52:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 04:52:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 04:52:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:54 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.17 deep-scrub starts
Jan 23 04:52:54 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.17 deep-scrub ok
Jan 23 04:52:55 np0005593294 ceph-mon[80126]: Rados config object exists: conf-nfs.cephfs
Jan 23 04:52:55 np0005593294 ceph-mon[80126]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu-rgw
Jan 23 04:52:55 np0005593294 ceph-mon[80126]: Bind address in nfs.cephfs.2.0.compute-0.fenqiu's ganesha conf is defaulting to empty
Jan 23 04:52:55 np0005593294 ceph-mon[80126]: Deploying daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0
Jan 23 04:52:55 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 23 04:52:55 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 23 04:52:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:52:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:52:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:52:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 23 04:52:56 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 23 04:52:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:57 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Jan 23 04:52:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 23 04:52:57 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Jan 23 04:52:57 np0005593294 ceph-mon[80126]: Deploying daemon haproxy.nfs.cephfs.compute-1.mnxlgm on compute-1
Jan 23 04:52:57 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 23 04:52:58 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 23 04:52:58 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 23 04:52:58 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 23 04:52:59 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 23 04:52:59 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 23 04:52:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 23 04:52:59 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 23 04:52:59 np0005593294 podman[85054]: 2026-01-23 09:52:59.950148082 +0000 UTC m=+2.980830786 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:53:00 np0005593294 podman[85054]: 2026-01-23 09:53:00.076650693 +0000 UTC m=+3.107333417 container create 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 04:53:00 np0005593294 systemd[1]: Started libpod-conmon-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope.
Jan 23 04:53:00 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.141506195s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 201.580200195s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.141422272s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 201.580200195s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140943527s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 201.579940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140881538s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 201.579940796s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140601158s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 201.579818726s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140559196s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 201.579818726s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140618324s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 201.580184937s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140561104s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 201.580184937s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:00 np0005593294 podman[85054]: 2026-01-23 09:53:00.363974641 +0000 UTC m=+3.394657345 container init 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 04:53:00 np0005593294 podman[85054]: 2026-01-23 09:53:00.37139877 +0000 UTC m=+3.402081454 container start 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 04:53:00 np0005593294 jolly_mendel[85171]: 0 0
Jan 23 04:53:00 np0005593294 systemd[1]: libpod-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope: Deactivated successfully.
Jan 23 04:53:00 np0005593294 conmon[85171]: conmon 751768b7deabe0750f34 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope/container/memory.events
Jan 23 04:53:00 np0005593294 podman[85054]: 2026-01-23 09:53:00.430442946 +0000 UTC m=+3.461125640 container attach 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 04:53:00 np0005593294 podman[85054]: 2026-01-23 09:53:00.430856867 +0000 UTC m=+3.461539551 container died 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 04:53:00 np0005593294 systemd[1]: var-lib-containers-storage-overlay-7ea2d98c86d410c1127884d606a7dd74545b4518f1c81071a01d562c8a92fdce-merged.mount: Deactivated successfully.
Jan 23 04:53:00 np0005593294 podman[85054]: 2026-01-23 09:53:00.58857233 +0000 UTC m=+3.619255024 container remove 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.12 scrub starts
Jan 23 04:53:00 np0005593294 systemd[1]: libpod-conmon-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope: Deactivated successfully.
Jan 23 04:53:00 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.12 scrub ok
Jan 23 04:53:00 np0005593294 systemd[1]: Reloading.
Jan 23 04:53:01 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:01 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:01 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 23 04:53:01 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:01 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 23 04:53:01 np0005593294 systemd[1]: Reloading.
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:01 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:01 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:01 np0005593294 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.mnxlgm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.12 scrub starts
Jan 23 04:53:01 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.12 scrub ok
Jan 23 04:53:01 np0005593294 podman[85318]: 2026-01-23 09:53:01.803468227 +0000 UTC m=+0.068990722 container create e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:53:01 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33f456f7f8080aaf0dc818d727e8500103e4388c03753b454f47838de1ecfb4a/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:01 np0005593294 podman[85318]: 2026-01-23 09:53:01.75674544 +0000 UTC m=+0.022267955 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:53:02 np0005593294 podman[85318]: 2026-01-23 09:53:02.068273654 +0000 UTC m=+0.333796159 container init e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:53:02 np0005593294 podman[85318]: 2026-01-23 09:53:02.074144472 +0000 UTC m=+0.339666967 container start e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:53:02 np0005593294 bash[85318]: e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f
Jan 23 04:53:02 np0005593294 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.mnxlgm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:53:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [NOTICE] 022/095302 (2) : New worker #1 (4) forked
Jan 23 04:53:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:02 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:02 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 23 04:53:02 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 23 04:53:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 23 04:53:02 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:02 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=71/72 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'771 lcod 62'770 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:02 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=10}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:02 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:03 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:03 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:03 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.322343826s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 207.051757812s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.322172165s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 207.051757812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=71/72 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.321352005s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 207.051666260s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=71/72 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.321168900s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 207.051666260s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.320775986s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 72'767 mlcod 72'767 active pruub 207.051757812s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.315814018s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 207.047042847s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.315752983s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 207.047042847s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.319846153s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 72'767 mlcod 0'0 unknown NOTIFY pruub 207.051757812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Jan 23 04:53:03 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Jan 23 04:53:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:04 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8000fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:04 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.15 deep-scrub starts
Jan 23 04:53:04 np0005593294 ceph-mon[80126]: Deploying daemon haproxy.nfs.cephfs.compute-0.yeogal on compute-0
Jan 23 04:53:04 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.15 deep-scrub ok
Jan 23 04:53:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 23 04:53:05 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.f scrub starts
Jan 23 04:53:05 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.f scrub ok
Jan 23 04:53:05 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:06 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:06 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.d scrub starts
Jan 23 04:53:06 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.d scrub ok
Jan 23 04:53:07 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.5 scrub starts
Jan 23 04:53:07 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.5 scrub ok
Jan 23 04:53:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:08 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:08 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 23 04:53:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 23 04:53:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 23 04:53:08 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.0 scrub starts
Jan 23 04:53:08 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.0 scrub ok
Jan 23 04:53:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:08 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e4001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:09 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 23 04:53:09 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:09 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:09 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:09 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 23 04:53:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 23 04:53:09 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:09 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.6( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:09 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:09 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.16( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:09 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1f scrub starts
Jan 23 04:53:09 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1f scrub ok
Jan 23 04:53:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:10 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8001c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:10 np0005593294 ceph-mon[80126]: Deploying daemon haproxy.nfs.cephfs.compute-2.bbaqsj on compute-2
Jan 23 04:53:10 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1b deep-scrub starts
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1b deep-scrub ok
Jan 23 04:53:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.16( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.6( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.6( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.16( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:10 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:11 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.16 scrub starts
Jan 23 04:53:11 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.16 scrub ok
Jan 23 04:53:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 23 04:53:11 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 23 04:53:11 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79) [0] r=0 lpr=79 pi=[66,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:11 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=79) [0] r=0 lpr=79 pi=[67,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:11 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79) [0] r=0 lpr=79 pi=[66,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:11 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79) [0] r=0 lpr=79 pi=[66,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:12 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.14 scrub starts
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.14 scrub ok
Jan 23 04:53:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:12 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e4001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:12 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 23 04:53:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=62'763 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=62'763 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[67,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[67,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:13 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1 deep-scrub starts
Jan 23 04:53:13 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1 deep-scrub ok
Jan 23 04:53:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:14 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8001c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:14 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 23 04:53:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.164137840s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 217.580154419s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.163958549s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 217.580169678s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.163928986s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.580169678s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.163748741s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 217.580154419s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=80/81 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=80/81 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=80/81 n=4 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:14 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 23 04:53:14 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 23 04:53:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:14 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82) [0] r=0 lpr=82 pi=[67,82)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82) [0] r=0 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Jan 23 04:53:15 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Jan 23 04:53:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:16 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:16 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=82/83 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82) [0] r=0 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=82/83 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:16 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 04:53:16 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:53:16 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:53:16 np0005593294 ceph-mon[80126]: Deploying daemon keepalived.nfs.cephfs.compute-1.vcrquf on compute-1
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=82/83 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[59,82)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=82/83 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[59,82)/1 crt=62'761 lcod 62'760 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:16 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 23 04:53:16 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 23 04:53:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:16 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 23 04:53:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=82/83 n=6 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.970046043s) [1] async=[1] r=-1 lpr=84 pi=[59,84)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 220.446502686s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=82/83 n=4 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.969717979s) [1] async=[1] r=-1 lpr=84 pi=[59,84)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 220.446441650s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=82/83 n=4 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.969678879s) [1] r=-1 lpr=84 pi=[59,84)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 220.446441650s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:17 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=82/83 n=6 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.969963074s) [1] r=-1 lpr=84 pi=[59,84)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 220.446502686s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:17 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Jan 23 04:53:17 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Jan 23 04:53:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:17 np0005593294 podman[85440]: 2026-01-23 09:53:17.933959038 +0000 UTC m=+2.899000893 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:53:17 np0005593294 podman[85440]: 2026-01-23 09:53:17.970117897 +0000 UTC m=+2.935159712 container create 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, release=1793, name=keepalived, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 23 04:53:18 np0005593294 systemd[1]: Started libpod-conmon-74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611.scope.
Jan 23 04:53:18 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:53:18 np0005593294 podman[85440]: 2026-01-23 09:53:18.071850618 +0000 UTC m=+3.036892513 container init 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, description=keepalived for Ceph, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, name=keepalived, com.redhat.component=keepalived-container)
Jan 23 04:53:18 np0005593294 podman[85440]: 2026-01-23 09:53:18.084278246 +0000 UTC m=+3.049320091 container start 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, io.buildah.version=1.28.2, distribution-scope=public, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, version=2.2.4, name=keepalived, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20)
Jan 23 04:53:18 np0005593294 podman[85440]: 2026-01-23 09:53:18.088482791 +0000 UTC m=+3.053524696 container attach 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, io.buildah.version=1.28.2, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., release=1793, io.openshift.expose-services=, vcs-type=git, com.redhat.component=keepalived-container, description=keepalived for Ceph, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 23 04:53:18 np0005593294 great_franklin[85533]: 0 0
Jan 23 04:53:18 np0005593294 systemd[1]: libpod-74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611.scope: Deactivated successfully.
Jan 23 04:53:18 np0005593294 podman[85440]: 2026-01-23 09:53:18.095211057 +0000 UTC m=+3.060252922 container died 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, release=1793, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 04:53:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:18 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:18 np0005593294 systemd[1]: var-lib-containers-storage-overlay-7b87cbaa715e1732428b6ea58cc6eec089245e9495a803abc48eaa43aa31956b-merged.mount: Deactivated successfully.
Jan 23 04:53:18 np0005593294 podman[85440]: 2026-01-23 09:53:18.14369586 +0000 UTC m=+3.108737675 container remove 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 04:53:18 np0005593294 systemd[1]: libpod-conmon-74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611.scope: Deactivated successfully.
Jan 23 04:53:18 np0005593294 systemd[1]: Reloading.
Jan 23 04:53:18 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:18 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:18 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:18 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1d deep-scrub starts
Jan 23 04:53:18 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1d deep-scrub ok
Jan 23 04:53:18 np0005593294 systemd[1]: Reloading.
Jan 23 04:53:18 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:18 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:18 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:18 np0005593294 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.vcrquf for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:53:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 23 04:53:19 np0005593294 podman[85675]: 2026-01-23 09:53:19.193468305 +0000 UTC m=+0.075819271 container create 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, name=keepalived, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, description=keepalived for Ceph)
Jan 23 04:53:19 np0005593294 podman[85675]: 2026-01-23 09:53:19.16087056 +0000 UTC m=+0.043221576 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:53:19 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09e464e4ed8a605a356fad7fe10c62525e64299965ede90a0c3d729d42259e69/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:19 np0005593294 podman[85675]: 2026-01-23 09:53:19.270909147 +0000 UTC m=+0.153260163 container init 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, description=keepalived for Ceph, vendor=Red Hat, Inc., release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, version=2.2.4)
Jan 23 04:53:19 np0005593294 podman[85675]: 2026-01-23 09:53:19.280913798 +0000 UTC m=+0.163264764 container start 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, version=2.2.4, com.redhat.component=keepalived-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, name=keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2)
Jan 23 04:53:19 np0005593294 bash[85675]: 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af
Jan 23 04:53:19 np0005593294 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.vcrquf for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Starting VRRP child process, pid=4
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Startup complete
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 04:53:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: VRRP_Script(check_backend) succeeded
Jan 23 04:53:19 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 23 04:53:19 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: Deploying daemon keepalived.nfs.cephfs.compute-0.lrsdkc on compute-0
Jan 23 04:53:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 23 04:53:19 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 86 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=86) [0] r=0 lpr=86 pi=[66,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:19 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 86 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=86) [0] r=0 lpr=86 pi=[67,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:20 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:20 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 23 04:53:20 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 23 04:53:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:20 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:20 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:21 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 23 04:53:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[66,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[66,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.f deep-scrub starts
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.f deep-scrub ok
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0] r=0 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:21 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.1a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0] r=0 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:22 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:22 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 23 04:53:22 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 23 04:53:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 23 04:53:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.1a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:22 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.1a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:22 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:22 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:22 2026: (VI_0) Entering MASTER STATE
Jan 23 04:53:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89) [0] r=0 lpr=89 pi=[67,89)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89) [0] r=0 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 23 04:53:23 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 23 04:53:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 luod=0'0 crt=61'756 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=0/0 n=7 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 luod=0'0 crt=62'773 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=61'756 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=0/0 n=7 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=62'773 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=89/90 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:23 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89) [0] r=0 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:24 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:24 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.0 scrub starts
Jan 23 04:53:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:24 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:24 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.0 scrub ok
Jan 23 04:53:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:24 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:25 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 23 04:53:25 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 23 04:53:25 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 23 04:53:25 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 23 04:53:25 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 91 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=90/91 n=7 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=62'773 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:25 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 91 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=90/91 n=4 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=61'756 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:26 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:26 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.e scrub starts
Jan 23 04:53:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:26 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:26 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.e scrub ok
Jan 23 04:53:26 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:26 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:26 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:26 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:26 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 23 04:53:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 luod=0'0 crt=60'756 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:27 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:27 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.6 deep-scrub starts
Jan 23 04:53:27 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.6 deep-scrub ok
Jan 23 04:53:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:28 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:28 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 23 04:53:28 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 23 04:53:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:28 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:28 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e400a3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:29 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 23 04:53:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:29 2026: (VI_0) Entering BACKUP STATE
Jan 23 04:53:29 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.12 deep-scrub starts
Jan 23 04:53:29 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.12 deep-scrub ok
Jan 23 04:53:29 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:53:29 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:53:29 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 04:53:29 np0005593294 ceph-mon[80126]: Deploying daemon keepalived.nfs.cephfs.compute-2.pawaai on compute-2
Jan 23 04:53:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 23 04:53:29 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 93 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=92/93 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:29 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 93 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:30 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e400a3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:30 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 23 04:53:30 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 23 04:53:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:30 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:30 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:31 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.14 deep-scrub starts
Jan 23 04:53:31 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.14 deep-scrub ok
Jan 23 04:53:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:32 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:32 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 23 04:53:32 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 23 04:53:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:32 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e400a3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:32 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593294 ceph-mon[80126]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 23 04:53:33 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Jan 23 04:53:33 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Jan 23 04:53:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:34 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:34 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 23 04:53:34 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 23 04:53:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:34 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:34 np0005593294 systemd-logind[807]: New session 36 of user zuul.
Jan 23 04:53:34 np0005593294 systemd[1]: Started Session 36 of User zuul.
Jan 23 04:53:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:34 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:35 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Jan 23 04:53:35 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Jan 23 04:53:35 np0005593294 python3.9[85861]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:53:36 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 23 04:53:36 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 94 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94) [0] r=0 lpr=94 pi=[73,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:36 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 94 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94) [0] r=0 lpr=94 pi=[73,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 23 04:53:36 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:36 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:36 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 23 04:53:36 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 23 04:53:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:36 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:36 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:37 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.b deep-scrub starts
Jan 23 04:53:37 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.b deep-scrub ok
Jan 23 04:53:37 np0005593294 python3.9[86077]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:53:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:38 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:38 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Jan 23 04:53:38 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Jan 23 04:53:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:38 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 23 04:53:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 23 04:53:38 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:38 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:38 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:38 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:38 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:38 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: Regenerating cephadm self-signed grafana TLS certificates
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: Deploying daemon grafana.compute-0 on compute-0
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 04:53:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 23 04:53:39 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 96 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=96) [0] r=0 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:39 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 96 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=96) [0] r=0 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:40 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:40 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:40 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 23 04:53:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 04:53:40 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 23 04:53:42 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 98 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=97/98 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:42 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 98 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=97/98 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.068169) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022068451, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7456, "num_deletes": 256, "total_data_size": 18505937, "memory_usage": 19292528, "flush_reason": "Manual Compaction"}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 23 04:53:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:42 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022162814, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11454816, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 251, "largest_seqno": 7461, "table_properties": {"data_size": 11424585, "index_size": 19300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9861, "raw_key_size": 96698, "raw_average_key_size": 24, "raw_value_size": 11348944, "raw_average_value_size": 2880, "num_data_blocks": 851, "num_entries": 3940, "num_filter_entries": 3940, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 1769161847, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 94624 microseconds, and 26163 cpu microseconds.
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.162890) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11454816 bytes OK
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.162931) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.164474) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.164520) EVENT_LOG_v1 {"time_micros": 1769162022164515, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.164543) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18464253, prev total WAL file size 18464253, number of live WAL files 2.
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.168587) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1773B)]
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022168742, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11456589, "oldest_snapshot_seqno": -1}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3688 keys, 11451458 bytes, temperature: kUnknown
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022238725, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11451458, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11421854, "index_size": 19254, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 92454, "raw_average_key_size": 25, "raw_value_size": 11349349, "raw_average_value_size": 3077, "num_data_blocks": 850, "num_entries": 3688, "num_filter_entries": 3688, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.239082) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11451458 bytes
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.241126) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.5 rd, 163.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.9, 0.0 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3945, records dropped: 257 output_compression: NoCompression
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.241162) EVENT_LOG_v1 {"time_micros": 1769162022241146, "job": 4, "event": "compaction_finished", "compaction_time_micros": 70091, "compaction_time_cpu_micros": 27297, "output_level": 6, "num_output_files": 1, "total_output_size": 11451458, "num_input_records": 3945, "num_output_records": 3688, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022244782, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022245029, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.168407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:42 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:42 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:43 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 23 04:53:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 23 04:53:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=8 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 luod=0'0 crt=62'768 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=8 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'768 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=5 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=5 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:44 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 23 04:53:44 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.086140633s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'771 mlcod 0'0 active pruub 244.444183350s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.085646629s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'761 mlcod 0'0 active pruub 244.444198608s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.085577011s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 244.444198608s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.085281372s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 244.444183350s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=99/100 n=8 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=99/100 n=5 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 23 04:53:44 np0005593294 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 23 04:53:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:45 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 23 04:53:45 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:45 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:45 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'761 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:45 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'761 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:45 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 23 04:53:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:46 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8003550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:46 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:46 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 23 04:53:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 102 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=101/102 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] async=[2] r=0 lpr=101 pi=[82,101)/1 crt=62'761 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 102 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=101/102 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] async=[2] r=0 lpr=101 pi=[82,101)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:48 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:48 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8003550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:48 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:52 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc002f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:52 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:52 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 23 04:53:53 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:53:53 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:53:53 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:53:53 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=101/102 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.513637543s) [2] async=[2] r=-1 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 62'771 active pruub 251.297988892s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:53 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=101/102 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.513319016s) [2] r=-1 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 251.297988892s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:53 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=101/102 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.508596420s) [2] async=[2] r=-1 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 62'761 active pruub 251.294219971s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:53 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=101/102 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.508519173s) [2] r=-1 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 251.294219971s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:53 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=103 pruub=15.795506477s) [2] r=-1 lpr=103 pi=[59,103)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 257.581420898s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:53 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=103 pruub=15.795460701s) [2] r=-1 lpr=103 pi=[59,103)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 257.581420898s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:54 np0005593294 kernel: ganesha.nfsd[85705]: segfault at 50 ip 00007faa6d51732e sp 00007fa9e3ffe210 error 4 in libntirpc.so.5.8[7faa6d4fc000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 23 04:53:54 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:53:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:54 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy ignored for local
Jan 23 04:53:54 np0005593294 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 23 04:53:54 np0005593294 systemd[1]: Started Process Core Dump (PID 86143/UID 0).
Jan 23 04:53:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 23 04:53:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:53:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:53:54 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:53:54 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 104 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=0 lpr=104 pi=[59,104)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:54 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 104 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=0 lpr=104 pi=[59,104)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:55 np0005593294 systemd-coredump[86144]: Process 84917 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007faa6d51732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007faa6d521900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 04:53:55 np0005593294 systemd[1]: systemd-coredump@0-86143-0.service: Deactivated successfully.
Jan 23 04:53:55 np0005593294 systemd[1]: systemd-coredump@0-86143-0.service: Consumed 1.231s CPU time.
Jan 23 04:53:55 np0005593294 podman[86150]: 2026-01-23 09:53:55.534004367 +0000 UTC m=+0.039553749 container died 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 23 04:53:55 np0005593294 systemd[1]: var-lib-containers-storage-overlay-c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3-merged.mount: Deactivated successfully.
Jan 23 04:53:55 np0005593294 systemd[82140]: Starting Mark boot as successful...
Jan 23 04:53:55 np0005593294 podman[86150]: 2026-01-23 09:53:55.586359054 +0000 UTC m=+0.091908436 container remove 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:53:55 np0005593294 systemd[82140]: Finished Mark boot as successful.
Jan 23 04:53:55 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:53:55 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 23 04:53:55 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 105 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=104/105 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] async=[2] r=0 lpr=104 pi=[59,104)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:55 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 23 04:53:55 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 04:53:55 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.782s CPU time.
Jan 23 04:53:56 np0005593294 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 04:53:56 np0005593294 systemd[1]: session-36.scope: Consumed 8.579s CPU time.
Jan 23 04:53:56 np0005593294 systemd-logind[807]: Session 36 logged out. Waiting for processes to exit.
Jan 23 04:53:56 np0005593294 systemd-logind[807]: Removed session 36.
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 23 04:53:56 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=104/105 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106 pruub=14.980967522s) [2] async=[2] r=-1 lpr=106 pi=[59,106)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 259.864685059s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:56 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=104/105 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106 pruub=14.980821609s) [2] r=-1 lpr=106 pi=[59,106)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 259.864685059s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593294 ceph-mon[80126]: Deploying daemon haproxy.rgw.default.compute-0.qabsws on compute-0
Jan 23 04:53:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 23 04:53:57 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 23 04:53:57 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 23 04:53:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:53:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.005000163s ======
Jan 23 04:53:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:58.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000163s
Jan 23 04:53:58 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:58 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:58 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:58 np0005593294 ceph-mon[80126]: Deploying daemon haproxy.rgw.default.compute-2.izjwnk on compute-2
Jan 23 04:53:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 23 04:53:59 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 23 04:53:59 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 23 04:54:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095400 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 23 04:54:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:00.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:00.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:54:00 np0005593294 ceph-mon[80126]: Deploying daemon keepalived.rgw.default.compute-0.tytkrd on compute-0
Jan 23 04:54:01 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 23 04:54:01 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 23 04:54:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:54:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:02.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:54:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:02.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: Deploying daemon keepalived.rgw.default.compute-2.qpmsjd on compute-2
Jan 23 04:54:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 23 04:54:04 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:04.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:04.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 23 04:54:05 np0005593294 ceph-mon[80126]: Deploying daemon prometheus.compute-0 on compute-0
Jan 23 04:54:05 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 23 04:54:05 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 1.
Jan 23 04:54:05 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:05 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.782s CPU time.
Jan 23 04:54:05 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:54:06 np0005593294 podman[86246]: 2026-01-23 09:54:06.164848925 +0000 UTC m=+0.051647994 container create 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 04:54:06 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:06 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:06 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:06 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:06 np0005593294 podman[86246]: 2026-01-23 09:54:06.229886428 +0000 UTC m=+0.116685547 container init 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:06 np0005593294 podman[86246]: 2026-01-23 09:54:06.139382659 +0000 UTC m=+0.026181778 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:06 np0005593294 podman[86246]: 2026-01-23 09:54:06.235026879 +0000 UTC m=+0.121825988 container start 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 04:54:06 np0005593294 bash[86246]: 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd
Jan 23 04:54:06 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:54:06 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 23 04:54:06 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:54:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:06.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:06.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:07 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 23 04:54:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 23 04:54:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:08.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:08.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:08 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 23 04:54:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 23 04:54:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 23 04:54:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:10.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095410 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:54:12 np0005593294 systemd-logind[807]: New session 37 of user zuul.
Jan 23 04:54:12 np0005593294 systemd[1]: Started Session 37 of User zuul.
Jan 23 04:54:12 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:12 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:12 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:12 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  4: '--setuser'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  5: 'ceph'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  6: '--setgroup'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  7: 'ceph'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr respawn  exe_path /proc/self/exe
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 04:54:12 np0005593294 systemd[1]: session-34.scope: Deactivated successfully.
Jan 23 04:54:12 np0005593294 systemd[1]: session-34.scope: Consumed 20.705s CPU time.
Jan 23 04:54:12 np0005593294 systemd-logind[807]: Session 34 logged out. Waiting for processes to exit.
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:54:12 np0005593294 systemd-logind[807]: Removed session 34.
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:12.487+0000 7f26bfd6e140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:12.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 04:54:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:12.576+0000 7f26bfd6e140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:12.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:12 np0005593294 python3.9[86480]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 04:54:13 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 04:54:13 np0005593294 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:54:13 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 04:54:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:13.435+0000 7f26bfd6e140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 python3.9[86666]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.112+0000 7f26bfd6e140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:  from numpy import show_config as show_numpy_config
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.280+0000 7f26bfd6e140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.348+0000 7f26bfd6e140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:54:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.477+0000 7f26bfd6e140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:14.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:14.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 04:54:14 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 04:54:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.460+0000 7f26bfd6e140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:54:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.688+0000 7f26bfd6e140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:54:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.759+0000 7f26bfd6e140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 04:54:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.821+0000 7f26bfd6e140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:54:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.893+0000 7f26bfd6e140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 04:54:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.966+0000 7f26bfd6e140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:54:15 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 04:54:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:16.321+0000 7f26bfd6e140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:54:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:16.410+0000 7f26bfd6e140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 04:54:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:16.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.597773) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056597880, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1002, "num_deletes": 251, "total_data_size": 2240223, "memory_usage": 2276272, "flush_reason": "Manual Compaction"}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056616142, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1428848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7466, "largest_seqno": 8463, "table_properties": {"data_size": 1424071, "index_size": 2301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10217, "raw_average_key_size": 18, "raw_value_size": 1414138, "raw_average_value_size": 2566, "num_data_blocks": 102, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162023, "oldest_key_time": 1769162023, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 18419 microseconds, and 6502 cpu microseconds.
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.616203) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1428848 bytes OK
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.616229) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.617827) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.617880) EVENT_LOG_v1 {"time_micros": 1769162056617870, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.617904) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2235010, prev total WAL file size 2235010, number of live WAL files 2.
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.619164) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1395KB)], [15(10MB)]
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056619265, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12880306, "oldest_snapshot_seqno": -1}
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 04:54:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:16.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3712 keys, 12441169 bytes, temperature: kUnknown
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056786078, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12441169, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12410499, "index_size": 20320, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 94995, "raw_average_key_size": 25, "raw_value_size": 12336481, "raw_average_value_size": 3323, "num_data_blocks": 879, "num_entries": 3712, "num_filter_entries": 3712, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.786350) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12441169 bytes
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.787569) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.2 rd, 74.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.9 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(17.7) write-amplify(8.7) OK, records in: 4239, records dropped: 527 output_compression: NoCompression
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.787592) EVENT_LOG_v1 {"time_micros": 1769162056787582, "job": 6, "event": "compaction_finished", "compaction_time_micros": 166887, "compaction_time_cpu_micros": 32208, "output_level": 6, "num_output_files": 1, "total_output_size": 12441169, "num_input_records": 4239, "num_output_records": 3712, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056787967, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056790216, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.619045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790249) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:16.861+0000 7f26bfd6e140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:17 np0005593294 python3.9[86823]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.429+0000 7f26bfd6e140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.501+0000 7f26bfd6e140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.578+0000 7f26bfd6e140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 04:54:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.730+0000 7f26bfd6e140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.808+0000 7f26bfd6e140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 04:54:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.981+0000 7f26bfd6e140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:18.220+0000 7f26bfd6e140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 04:54:18 np0005593294 python3.9[86977]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:18.515+0000 7f26bfd6e140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 04:54:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:18.590+0000 7f26bfd6e140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr load Constructed class from module: dashboard
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: mgr load Constructed class from module: prometheus
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus INFO root] Starting engine...
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: [23/Jan/2026:09:54:18] ENGINE Bus STARTING
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:18] ENGINE Bus STARTING
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: CherryPy Checker:
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: The Application mounted at '' has an empty config.
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x55fdcbbb1860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Starting engine...
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: [23/Jan/2026:09:54:18] ENGINE Serving on http://:::9283
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [dashboard INFO root] Engine started...
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:18] ENGINE Serving on http://:::9283
Jan 23 04:54:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: [23/Jan/2026:09:54:18] ENGINE Bus STARTED
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:18] ENGINE Bus STARTED
Jan 23 04:54:18 np0005593294 ceph-mgr[80432]: [prometheus INFO root] Engine started.
Jan 23 04:54:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:18.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:19 np0005593294 python3.9[87155]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:54:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 23 04:54:19 np0005593294 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:54:19 np0005593294 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 04:54:20 np0005593294 python3.9[87308]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:54:20 np0005593294 systemd-logind[807]: New session 38 of user ceph-admin.
Jan 23 04:54:20 np0005593294 systemd[1]: Started Session 38 of User ceph-admin.
Jan 23 04:54:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:20.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:20.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:20 np0005593294 ceph-mon[80126]: Manager daemon compute-0.nbdygh is now available
Jan 23 04:54:20 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:20 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:20 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 04:54:20 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 04:54:21 np0005593294 python3.9[87552]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:54:21 np0005593294 podman[87583]: 2026-01-23 09:54:21.188940235 +0000 UTC m=+0.072641741 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 23 04:54:21 np0005593294 network[87619]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:54:21 np0005593294 network[87620]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:54:21 np0005593294 network[87621]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:54:21 np0005593294 podman[87583]: 2026-01-23 09:54:21.31837422 +0000 UTC m=+0.202075756 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 23 04:54:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095421 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:54:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [NOTICE] 022/095421 (4) : haproxy version is 2.3.17-d1c9119
Jan 23 04:54:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [NOTICE] 022/095421 (4) : path to executable is /usr/local/sbin/haproxy
Jan 23 04:54:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [ALERT] 022/095421 (4) : backend 'backend' has no server available!
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Bus STARTING
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Serving on http://192.168.122.100:8765
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Serving on https://192.168.122.100:7150
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Bus STARTED
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Client ('192.168.122.100', 52034) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 23 04:54:22 np0005593294 podman[87747]: 2026-01-23 09:54:22.332716862 +0000 UTC m=+0.062650119 container exec 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:22 np0005593294 podman[87747]: 2026-01-23 09:54:22.36915603 +0000 UTC m=+0.099089237 container exec_died 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:22.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:22 np0005593294 podman[87859]: 2026-01-23 09:54:22.687365606 +0000 UTC m=+0.055565038 container exec 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 04:54:22 np0005593294 podman[87859]: 2026-01-23 09:54:22.700732434 +0000 UTC m=+0.068931866 container exec_died 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 04:54:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:22.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:22 np0005593294 podman[87927]: 2026-01-23 09:54:22.924282131 +0000 UTC m=+0.053441262 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:54:22 np0005593294 podman[87927]: 2026-01-23 09:54:22.933654364 +0000 UTC m=+0.062813505 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:54:23 np0005593294 podman[88004]: 2026-01-23 09:54:23.153342759 +0000 UTC m=+0.056827747 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.component=keepalived-container, description=keepalived for Ceph, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.buildah.version=1.28.2, name=keepalived, io.openshift.tags=Ceph keepalived, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793)
Jan 23 04:54:23 np0005593294 podman[88004]: 2026-01-23 09:54:23.190940135 +0000 UTC m=+0.094425083 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.buildah.version=1.28.2, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, description=keepalived for Ceph, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.expose-services=)
Jan 23 04:54:23 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:54:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa538000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:24.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:24.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:24 np0005593294 python3.9[88370]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:54:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 23 04:54:25 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 23 04:54:25 np0005593294 python3.9[88537]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:54:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095426 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:54:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 23 04:54:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528000f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:26.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:26 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 23 04:54:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:26.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:27 np0005593294 python3.9[88692]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:54:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 23 04:54:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:28 np0005593294 python3.9[88974]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:54:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:28.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:28.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:54:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 23 04:54:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:29 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 122 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=122 pruub=14.288291931s) [1] r=-1 lpr=122 pi=[89,122)/1 crt=62'771 mlcod 0'0 active pruub 291.752838135s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:29 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 122 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=122 pruub=14.288235664s) [1] r=-1 lpr=122 pi=[89,122)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 291.752838135s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:29 np0005593294 python3.9[89429]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:54:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:29 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:54:29 np0005593294 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:54:29 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:54:29 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 23 04:54:29 np0005593294 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:54:29 np0005593294 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:54:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 23 04:54:29 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 123 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=123) [1]/[0] r=0 lpr=123 pi=[89,123)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:29 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 123 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=123) [1]/[0] r=0 lpr=123 pi=[89,123)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:30.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 04:54:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:30.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:30 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 124 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=123/124 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=123) [1]/[0] async=[1] r=0 lpr=123 pi=[89,123)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:30 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 23 04:54:30 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 125 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=123/124 n=7 ec=59/46 lis/c=123/89 les/c/f=124/90/0 sis=125 pruub=15.890355110s) [1] async=[1] r=-1 lpr=125 pi=[89,125)/1 crt=62'771 mlcod 62'771 active pruub 295.053192139s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:30 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 125 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=123/124 n=7 ec=59/46 lis/c=123/89 les/c/f=124/90/0 sis=125 pruub=15.890196800s) [1] r=-1 lpr=125 pi=[89,125)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 295.053192139s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:31 np0005593294 ceph-mon[80126]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 04:54:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 04:54:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 04:54:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:54:31 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 23 04:54:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5300029b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:32.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:32.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095432 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:54:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:32 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 23 04:54:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 23 04:54:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:33 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 127 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=127 pruub=15.797190666s) [1] r=-1 lpr=127 pi=[92,127)/1 crt=60'756 mlcod 0'0 active pruub 297.948699951s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:33 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 127 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=127 pruub=15.797089577s) [1] r=-1 lpr=127 pi=[92,127)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 297.948699951s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:34 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 23 04:54:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 23 04:54:34 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 128 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=128) [1]/[0] r=0 lpr=128 pi=[92,128)/1 crt=60'756 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:34 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 128 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=128) [1]/[0] r=0 lpr=128 pi=[92,128)/1 crt=60'756 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5300032d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:34.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:34.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:35 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 23 04:54:35 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:35 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 23 04:54:35 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 129 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=128/129 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=128) [1]/[0] async=[1] r=0 lpr=128 pi=[92,128)/1 crt=60'756 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:36 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 23 04:54:36 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 130 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=128/129 n=2 ec=59/46 lis/c=128/92 les/c/f=129/93/0 sis=130 pruub=14.894608498s) [1] async=[1] r=-1 lpr=130 pi=[92,130)/1 crt=60'756 mlcod 60'756 active pruub 299.372436523s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:36 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 130 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=128/129 n=2 ec=59/46 lis/c=128/92 les/c/f=129/93/0 sis=130 pruub=14.894343376s) [1] r=-1 lpr=130 pi=[92,130)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 299.372436523s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:36 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 23 04:54:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:36.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:36.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:37 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 23 04:54:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c000e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:38 np0005593294 ceph-mon[80126]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 23 04:54:38 np0005593294 ceph-mon[80126]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 23 04:54:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 23 04:54:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.nbdygh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:54:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:38.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:38.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: Reconfiguring mgr.compute-0.nbdygh (monmap changed)...
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: Reconfiguring daemon mgr.compute-0.nbdygh on compute-0
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 23 04:54:39 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 132 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=132 pruub=10.343003273s) [2] r=-1 lpr=132 pi=[80,132)/1 crt=62'763 mlcod 0'0 active pruub 298.424377441s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:39 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 132 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=132 pruub=10.342778206s) [2] r=-1 lpr=132 pi=[80,132)/1 crt=62'763 mlcod 0'0 unknown NOTIFY pruub 298.424377441s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5100016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c001920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: Reconfiguring osd.1 (monmap changed)...
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: Reconfiguring daemon osd.1 on compute-0
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:40.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 23 04:54:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 133 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=0 lpr=133 pi=[80,133)/1 crt=62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:40 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 133 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=0 lpr=133 pi=[80,133)/1 crt=62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:40.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:41 np0005593294 ceph-mon[80126]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 23 04:54:41 np0005593294 ceph-mon[80126]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 23 04:54:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:54:41 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 23 04:54:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 134 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=134) [0] r=0 lpr=134 pi=[103,134)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:41 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 134 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=133/134 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] async=[2] r=0 lpr=133 pi=[80,133)/1 crt=62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5100016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:42.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:42.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c001920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 23 04:54:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=133/134 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135 pruub=14.816746712s) [2] async=[2] r=-1 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 62'763 active pruub 306.180419922s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=133/134 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135 pruub=14.816671371s) [2] r=-1 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 0'0 unknown NOTIFY pruub 306.180419922s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=-1 lpr=135 pi=[103,135)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:43 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=-1 lpr=135 pi=[103,135)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095443 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:54:43 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:54:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:44.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:44.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5100016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:45 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 23 04:54:46 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 23 04:54:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137) [0] r=0 lpr=137 pi=[103,137)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:46 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137) [0] r=0 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c001920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:46.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:46 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:46 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:46 np0005593294 ceph-mon[80126]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 23 04:54:46 np0005593294 ceph-mon[80126]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 23 04:54:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:46.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 23 04:54:47 np0005593294 ceph-osd[77616]: osd.0 pg_epoch: 138 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=137/138 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137) [0] r=0 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c002db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:48.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:48.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.811204544 +0000 UTC m=+0.072750805 container create 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:54:48 np0005593294 systemd[1]: Started libpod-conmon-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope.
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.783930872 +0000 UTC m=+0.045477183 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:48 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.908866256 +0000 UTC m=+0.170412527 container init 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.917863167 +0000 UTC m=+0.179409418 container start 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.921429409 +0000 UTC m=+0.182975750 container attach 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 04:54:48 np0005593294 vigilant_lederberg[90088]: 167 167
Jan 23 04:54:48 np0005593294 systemd[1]: libpod-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope: Deactivated successfully.
Jan 23 04:54:48 np0005593294 conmon[90088]: conmon 19ee01b7379ed6f34fbc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope/container/memory.events
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.926229589 +0000 UTC m=+0.187775840 container died 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 04:54:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:48 np0005593294 systemd[1]: var-lib-containers-storage-overlay-31799003e0affa5ff836fc49d3057b4f51eef1f354c9380b2fea3a2c64b052d6-merged.mount: Deactivated successfully.
Jan 23 04:54:48 np0005593294 podman[90071]: 2026-01-23 09:54:48.973716443 +0000 UTC m=+0.235262694 container remove 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:54:48 np0005593294 systemd[1]: libpod-conmon-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope: Deactivated successfully.
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.543375588 +0000 UTC m=+0.046772314 container create c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:54:49 np0005593294 systemd[1]: Started libpod-conmon-c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d.scope.
Jan 23 04:54:49 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.523127544 +0000 UTC m=+0.026524300 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.625835164 +0000 UTC m=+0.129231980 container init c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.631723968 +0000 UTC m=+0.135120694 container start c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:54:49 np0005593294 suspicious_germain[90189]: 167 167
Jan 23 04:54:49 np0005593294 systemd[1]: libpod-c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d.scope: Deactivated successfully.
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.636416655 +0000 UTC m=+0.139813381 container attach c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.636983783 +0000 UTC m=+0.140380509 container died c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 04:54:49 np0005593294 systemd[1]: var-lib-containers-storage-overlay-f305bf7e8e7cb32a61126d20d70c5e49e56dd18c574b01be6a816ddedbbf10a8-merged.mount: Deactivated successfully.
Jan 23 04:54:49 np0005593294 podman[90174]: 2026-01-23 09:54:49.675583729 +0000 UTC m=+0.178980455 container remove c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 04:54:49 np0005593294 systemd[1]: libpod-conmon-c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d.scope: Deactivated successfully.
Jan 23 04:54:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.288250958 +0000 UTC m=+0.043812501 container create 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:54:50 np0005593294 systemd[1]: Started libpod-conmon-6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf.scope.
Jan 23 04:54:50 np0005593294 systemd[1]: Started libcrun container.
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.352385142 +0000 UTC m=+0.107946715 container init 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.358779161 +0000 UTC m=+0.114340734 container start 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 04:54:50 np0005593294 vigilant_ishizaka[90296]: 167 167
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.268935523 +0000 UTC m=+0.024497096 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:50 np0005593294 systemd[1]: libpod-6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf.scope: Deactivated successfully.
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.36480039 +0000 UTC m=+0.120361983 container attach 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.365083009 +0000 UTC m=+0.120644562 container died 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325)
Jan 23 04:54:50 np0005593294 ceph-mon[80126]: Reconfiguring osd.0 (monmap changed)...
Jan 23 04:54:50 np0005593294 ceph-mon[80126]: Reconfiguring daemon osd.0 on compute-1
Jan 23 04:54:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:54:50 np0005593294 systemd[1]: var-lib-containers-storage-overlay-f947111ae5e2f892803a770e79cef77868f171642ad71c41b77171b2f2206cd5-merged.mount: Deactivated successfully.
Jan 23 04:54:50 np0005593294 podman[90280]: 2026-01-23 09:54:50.407697991 +0000 UTC m=+0.163259534 container remove 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:50 np0005593294 systemd[1]: libpod-conmon-6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf.scope: Deactivated successfully.
Jan 23 04:54:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:50.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:50.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:50 np0005593294 systemd[1]: Stopping Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:54:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c002db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:51 np0005593294 podman[90411]: 2026-01-23 09:54:51.064071864 +0000 UTC m=+0.044340356 container died 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:51 np0005593294 systemd[1]: var-lib-containers-storage-overlay-1a9cf7871bcaed94d55bf97d20a317e95aa8ecd54623be987a412d3816ee0ab4-merged.mount: Deactivated successfully.
Jan 23 04:54:51 np0005593294 podman[90411]: 2026-01-23 09:54:51.107039718 +0000 UTC m=+0.087308210 container remove 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:51 np0005593294 bash[90411]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1
Jan 23 04:54:51 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@node-exporter.compute-1.service: Main process exited, code=exited, status=143/n/a
Jan 23 04:54:51 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@node-exporter.compute-1.service: Failed with result 'exit-code'.
Jan 23 04:54:51 np0005593294 systemd[1]: Stopped Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:51 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@node-exporter.compute-1.service: Consumed 2.430s CPU time.
Jan 23 04:54:51 np0005593294 systemd[1]: Starting Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:54:51 np0005593294 ceph-mon[80126]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 23 04:54:51 np0005593294 ceph-mon[80126]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 23 04:54:51 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:51 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:51 np0005593294 podman[90516]: 2026-01-23 09:54:51.551307762 +0000 UTC m=+0.046136912 container create 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:51 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e14a3ca03b62f0376285049141ffbb302ed3ee029612a5beb01e4b4d9873ce85/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:51 np0005593294 podman[90516]: 2026-01-23 09:54:51.614468537 +0000 UTC m=+0.109297707 container init 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:51 np0005593294 podman[90516]: 2026-01-23 09:54:51.619771032 +0000 UTC m=+0.114600192 container start 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:51 np0005593294 bash[90516]: 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78
Jan 23 04:54:51 np0005593294 podman[90516]: 2026-01-23 09:54:51.533486506 +0000 UTC m=+0.028315676 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.629Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.629Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 23 04:54:51 np0005593294 systemd[1]: Started Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.632Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.632Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=dmi
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=entropy
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=os
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=pressure
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=rapl
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=selinux
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=stat
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=textfile
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=time
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=uname
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.637Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 23 04:54:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.637Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 23 04:54:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:52 np0005593294 ceph-mon[80126]: Reconfiguring node-exporter.compute-1 (unknown last config time)...
Jan 23 04:54:52 np0005593294 ceph-mon[80126]: Reconfiguring daemon node-exporter.compute-1 on compute-1
Jan 23 04:54:52 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:52 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:52 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:54:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:52.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:52.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 23 04:54:53 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c002db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:54 np0005593294 podman[90673]: 2026-01-23 09:54:54.239258531 +0000 UTC m=+0.077409860 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:54:54 np0005593294 podman[90673]: 2026-01-23 09:54:54.376985786 +0000 UTC m=+0.215137095 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:54:54 np0005593294 ceph-mon[80126]: Reconfiguring crash.compute-2 (unknown last config time)...
Jan 23 04:54:54 np0005593294 ceph-mon[80126]: Reconfiguring daemon crash.compute-2 on compute-2
Jan 23 04:54:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:54.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:54.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:54 np0005593294 podman[90795]: 2026-01-23 09:54:54.857604726 +0000 UTC m=+0.062198444 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:54 np0005593294 podman[90795]: 2026-01-23 09:54:54.865959787 +0000 UTC m=+0.070553505 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:55 np0005593294 podman[90885]: 2026-01-23 09:54:55.2173277 +0000 UTC m=+0.050647314 container exec 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:54:55 np0005593294 podman[90885]: 2026-01-23 09:54:55.230905094 +0000 UTC m=+0.064224708 container exec_died 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:54:55 np0005593294 podman[90949]: 2026-01-23 09:54:55.425437514 +0000 UTC m=+0.053039099 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:54:55 np0005593294 podman[90949]: 2026-01-23 09:54:55.43491291 +0000 UTC m=+0.062514465 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 04:54:55 np0005593294 podman[91017]: 2026-01-23 09:54:55.690241 +0000 UTC m=+0.114863451 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, name=keepalived, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.openshift.tags=Ceph keepalived, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph)
Jan 23 04:54:55 np0005593294 podman[91037]: 2026-01-23 09:54:55.781714348 +0000 UTC m=+0.070036160 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, distribution-scope=public, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.buildah.version=1.28.2)
Jan 23 04:54:55 np0005593294 podman[91017]: 2026-01-23 09:54:55.787076586 +0000 UTC m=+0.211699027 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.buildah.version=1.28.2, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=keepalived for Ceph, version=2.2.4, io.openshift.tags=Ceph keepalived, distribution-scope=public)
Jan 23 04:54:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:56.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:54:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:58.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:54:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:58.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:58 np0005593294 ceph-mon[80126]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Jan 23 04:54:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:00.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:00.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:02.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:02.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:03 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:55:03 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:55:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:04.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:04.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:55:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003d10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:06.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:06.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003d30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:08.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:08.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:10.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:10.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:12.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:12.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518002360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:14.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:14.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:16.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:16.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518002360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:18.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:18.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003070 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:20.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:20.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003070 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:22.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:22.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:23 np0005593294 python3.9[91307]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:55:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:24.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:24.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:25 np0005593294 python3.9[91595]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 04:55:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:26 np0005593294 python3.9[91748]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 04:55:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:26.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:26.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:27 np0005593294 python3.9[91900]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:55:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:28 np0005593294 python3.9[92053]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 04:55:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:28.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:28.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:29 np0005593294 python3.9[92206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:30.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:31 np0005593294 python3.9[92358]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:55:31 np0005593294 python3.9[92437]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:55:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ef0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:32.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:32.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:32 np0005593294 python3.9[92589]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:55:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:34 np0005593294 python3.9[92744]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 04:55:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003f10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:34.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:34.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:34 np0005593294 python3.9[92897]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 04:55:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095535 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:55:35 np0005593294 python3.9[93051]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:55:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:36.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:36 np0005593294 python3.9[93203]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 04:55:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:36.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:37 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:37 np0005593294 python3.9[93381]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:55:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:38.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:38.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:39 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:39 np0005593294 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 04:55:39 np0005593294 systemd[1]: session-19.scope: Consumed 9.124s CPU time.
Jan 23 04:55:39 np0005593294 systemd-logind[807]: Session 19 logged out. Waiting for processes to exit.
Jan 23 04:55:39 np0005593294 systemd-logind[807]: Removed session 19.
Jan 23 04:55:40 np0005593294 python3.9[93536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:40.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:40 np0005593294 python3.9[93690]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:55:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:40.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:41 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:41 np0005593294 python3.9[93768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:42 np0005593294 python3.9[93921]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:55:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:42 np0005593294 python3.9[93999]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:42.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:42.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:43 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:43 np0005593294 python3.9[94151]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:55:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:55:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:44.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:44.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:44 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 23 04:55:44 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:44.943083) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:55:44 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 23 04:55:44 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144943241, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2001, "num_deletes": 251, "total_data_size": 8190339, "memory_usage": 8441216, "flush_reason": "Manual Compaction"}
Jan 23 04:55:44 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145009671, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5055840, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8468, "largest_seqno": 10464, "table_properties": {"data_size": 5047090, "index_size": 5308, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19703, "raw_average_key_size": 20, "raw_value_size": 5028702, "raw_average_value_size": 5332, "num_data_blocks": 236, "num_entries": 943, "num_filter_entries": 943, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162058, "oldest_key_time": 1769162058, "file_creation_time": 1769162144, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 66615 microseconds, and 14380 cpu microseconds.
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:55:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:45 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.009764) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5055840 bytes OK
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.009807) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.057969) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.058082) EVENT_LOG_v1 {"time_micros": 1769162145058055, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.058115) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8180689, prev total WAL file size 8180689, number of live WAL files 2.
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.062426) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4937KB)], [18(11MB)]
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145062575, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17497009, "oldest_snapshot_seqno": -1}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4121 keys, 13782054 bytes, temperature: kUnknown
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145176562, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13782054, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13748468, "index_size": 22212, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10309, "raw_key_size": 105010, "raw_average_key_size": 25, "raw_value_size": 13666938, "raw_average_value_size": 3316, "num_data_blocks": 954, "num_entries": 4121, "num_filter_entries": 4121, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162145, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.176947) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13782054 bytes
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.180383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.3 rd, 120.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 11.9 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4655, records dropped: 534 output_compression: NoCompression
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.180417) EVENT_LOG_v1 {"time_micros": 1769162145180402, "job": 8, "event": "compaction_finished", "compaction_time_micros": 114164, "compaction_time_cpu_micros": 36470, "output_level": 6, "num_output_files": 1, "total_output_size": 13782054, "num_input_records": 4655, "num_output_records": 4121, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145181425, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145183481, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.062283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:46 np0005593294 python3.9[94304]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:55:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530002b50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:46.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:46.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:47 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:47 np0005593294 python3.9[94456]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 04:55:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:47 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:55:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:47 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:55:48 np0005593294 python3.9[94607]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:55:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:48.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:48.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:49 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530002b50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:49 np0005593294 python3.9[94760]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:55:49 np0005593294 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 04:55:49 np0005593294 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 04:55:49 np0005593294 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 04:55:49 np0005593294 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:55:50 np0005593294 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:55:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:50.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:55:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:51 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:51 np0005593294 python3.9[94921]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 04:55:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530003860 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:52.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:52.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:53 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530003860 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:54.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:54.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:55 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095555 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:55:55 np0005593294 python3.9[95076]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:55:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:56.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:56.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:57 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:57 np0005593294 python3.9[95256]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:55:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:58 np0005593294 systemd[1]: session-37.scope: Deactivated successfully.
Jan 23 04:55:58 np0005593294 systemd[1]: session-37.scope: Consumed 1min 7.362s CPU time.
Jan 23 04:55:58 np0005593294 systemd-logind[807]: Session 37 logged out. Waiting for processes to exit.
Jan 23 04:55:58 np0005593294 systemd-logind[807]: Removed session 37.
Jan 23 04:55:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:58.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:55:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:58.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:59 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:00.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:00.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:01 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:02.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:02.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:03 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:04.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:56:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:04.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:56:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:05 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:06 np0005593294 systemd-logind[807]: New session 39 of user zuul.
Jan 23 04:56:06 np0005593294 systemd[1]: Started Session 39 of User zuul.
Jan 23 04:56:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:06.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:06.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:07 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:07 np0005593294 python3.9[95520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:08.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:09 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:09 np0005593294 python3.9[95678]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 04:56:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:10.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:10.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:11 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:11 np0005593294 python3.9[95831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:11 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:11 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:12 np0005593294 python3.9[95916]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:56:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:12.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:56:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:56:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:12.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:13 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:14.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:14.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:15 np0005593294 python3.9[96072]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:15 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:16.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:16.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:17 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:17 np0005593294 python3.9[96252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:56:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:18 np0005593294 python3.9[96405]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:18.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:18.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:19 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:19 np0005593294 python3.9[96558]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 04:56:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:20.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:20 np0005593294 python3.9[96708]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:20.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:21 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095621 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:56:22 np0005593294 python3.9[96867]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:22.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:56:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:22.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:56:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:23 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:23 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:24 np0005593294 python3.9[97046]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:56:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:24.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:24.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:25 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:26 np0005593294 python3.9[97334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 04:56:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:26.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:27 np0005593294 python3.9[97484]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:56:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:27 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:27 np0005593294 python3.9[97639]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:28.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:28.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:29 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:30.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:30.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:31 np0005593294 python3.9[97793]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:31 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:32.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:56:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:32.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:56:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:56:33 np0005593294 python3.9[97948]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:56:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:34.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:34 np0005593294 python3.9[98102]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 23 04:56:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:34.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:35 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:35 np0005593294 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 04:56:35 np0005593294 systemd[1]: session-39.scope: Consumed 18.813s CPU time.
Jan 23 04:56:35 np0005593294 systemd-logind[807]: Session 39 logged out. Waiting for processes to exit.
Jan 23 04:56:35 np0005593294 systemd-logind[807]: Removed session 39.
Jan 23 04:56:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:56:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:56:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:36.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:36.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:37 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:38.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:56:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:39.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:56:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:39 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:39 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:56:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:56:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:40.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:56:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:41.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:41 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:41 np0005593294 systemd-logind[807]: New session 40 of user zuul.
Jan 23 04:56:41 np0005593294 systemd[1]: Started Session 40 of User zuul.
Jan 23 04:56:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:42 np0005593294 python3.9[98310]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:42.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095642 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:56:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:43.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:43 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510004340 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:43 np0005593294 python3.9[98464]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:44 np0005593294 kernel: ganesha.nfsd[91121]: segfault at 50 ip 00007fa5c06ee32e sp 00007fa53f7fd210 error 4 in libntirpc.so.5.8[7fa5c06d3000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 04:56:44 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:56:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy ignored for local
Jan 23 04:56:44 np0005593294 systemd[1]: Started Process Core Dump (PID 98558/UID 0).
Jan 23 04:56:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:44.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:44 np0005593294 python3.9[98662]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:56:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:45.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:45 np0005593294 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 04:56:45 np0005593294 systemd[1]: session-40.scope: Consumed 2.459s CPU time.
Jan 23 04:56:45 np0005593294 systemd-logind[807]: Session 40 logged out. Waiting for processes to exit.
Jan 23 04:56:45 np0005593294 systemd-logind[807]: Removed session 40.
Jan 23 04:56:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095645 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:56:45 np0005593294 systemd-coredump[98566]: Process 86266 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007fa5c06ee32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 04:56:45 np0005593294 systemd[1]: systemd-coredump@1-98558-0.service: Deactivated successfully.
Jan 23 04:56:45 np0005593294 systemd[1]: systemd-coredump@1-98558-0.service: Consumed 1.308s CPU time.
Jan 23 04:56:45 np0005593294 podman[98694]: 2026-01-23 09:56:45.87651893 +0000 UTC m=+0.045238319 container died 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 23 04:56:45 np0005593294 systemd[1]: var-lib-containers-storage-overlay-41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d-merged.mount: Deactivated successfully.
Jan 23 04:56:45 np0005593294 systemd[82140]: Created slice User Background Tasks Slice.
Jan 23 04:56:45 np0005593294 systemd[82140]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 04:56:45 np0005593294 systemd[82140]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 04:56:46 np0005593294 podman[98694]: 2026-01-23 09:56:46.053864249 +0000 UTC m=+0.222583608 container remove 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:56:46 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:56:46 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 04:56:46 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.185s CPU time.
Jan 23 04:56:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:46.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:47.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:48.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:49.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095650 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:56:50 np0005593294 systemd-logind[807]: New session 41 of user zuul.
Jan 23 04:56:50 np0005593294 systemd[1]: Started Session 41 of User zuul.
Jan 23 04:56:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:50.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:56:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:51.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:56:51 np0005593294 python3.9[98897]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:52 np0005593294 python3.9[99051]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:56:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:52.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:56:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:53.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:53 np0005593294 python3.9[99208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:54 np0005593294 python3.9[99292]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:54.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:55.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:56 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 2.
Jan 23 04:56:56 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:56:56 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.185s CPU time.
Jan 23 04:56:56 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:56:56 np0005593294 podman[99425]: 2026-01-23 09:56:56.528141399 +0000 UTC m=+0.025387734 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:56:56 np0005593294 podman[99425]: 2026-01-23 09:56:56.732234008 +0000 UTC m=+0.229480323 container create 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:56:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:56.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:56 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:57 np0005593294 podman[99425]: 2026-01-23 09:56:57.004292627 +0000 UTC m=+0.501538972 container init 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 23 04:56:57 np0005593294 podman[99425]: 2026-01-23 09:56:57.010085234 +0000 UTC m=+0.507331549 container start 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:56:57 np0005593294 python3.9[99505]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:56:57 np0005593294 bash[99425]: 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743
Jan 23 04:56:57 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:56:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:56:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:56:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:58 np0005593294 python3.9[99770]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:56:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:58.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:56:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:56:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:59.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:56:59 np0005593294 python3.9[99923]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:57:00 np0005593294 python3.9[100089]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:00.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:01.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:01 np0005593294 python3.9[100167]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:02 np0005593294 python3.9[100320]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:02 np0005593294 python3.9[100398]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:02.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:03.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:03 np0005593294 python3.9[100551]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:04 np0005593294 python3.9[100703]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:57:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:57:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:04.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:04 np0005593294 python3.9[100855]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:05.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:06 np0005593294 python3.9[101008]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:57:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:06.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:57:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095707 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:57:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:57:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:07.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:57:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:07 np0005593294 python3.9[101161]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:57:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:08.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:09.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:10 np0005593294 python3.9[101315]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000a:nfs.cephfs.0: -2
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:57:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:57:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:10.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:11.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:11 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:11 np0005593294 python3.9[101481]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:57:11 np0005593294 python3.9[101638]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:57:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:12 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:12 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:12 np0005593294 python3.9[101790]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:57:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:12.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:13.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:13 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:13 np0005593294 python3.9[101944]: ansible-service_facts Invoked
Jan 23 04:57:14 np0005593294 network[101961]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:57:14 np0005593294 network[101962]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:57:14 np0005593294 network[101963]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:57:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095714 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:57:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:14 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:14 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:14.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:15.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:15 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:16 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:16 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:57:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:16.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:57:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:57:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:17.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:57:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:17 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:18 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:18 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:18.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:19.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:19 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:20 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:20 np0005593294 python3.9[102443]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:57:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:20 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:20.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:21.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:21 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:22 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:22 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:22.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:23.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:23 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:23 np0005593294 python3.9[102680]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 04:57:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:57:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:57:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:24 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:24 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:24.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:25.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:25 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:25 np0005593294 python3.9[102832]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:25 np0005593294 python3.9[102911]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:26 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:26 np0005593294 python3.9[103063]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:26 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:57:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:26.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:57:27 np0005593294 python3.9[103141]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:57:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:27.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:57:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:27 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:28 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:28 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:28 np0005593294 python3.9[103294]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:57:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:28.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:57:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:29.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:29 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:30 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:30 np0005593294 python3.9[103472]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:57:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:30 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:30.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:31.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:31 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:31 np0005593294 python3.9[103556]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:57:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:32 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:32 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:32 np0005593294 systemd-logind[807]: Session 41 logged out. Waiting for processes to exit.
Jan 23 04:57:32 np0005593294 systemd[1]: session-41.scope: Deactivated successfully.
Jan 23 04:57:32 np0005593294 systemd[1]: session-41.scope: Consumed 24.394s CPU time.
Jan 23 04:57:32 np0005593294 systemd-logind[807]: Removed session 41.
Jan 23 04:57:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:32.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:33.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:33 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:34 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:34 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:34.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:35.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:35 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:36 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:36 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:36.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:37.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:37 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:38 np0005593294 systemd-logind[807]: New session 42 of user zuul.
Jan 23 04:57:38 np0005593294 systemd[1]: Started Session 42 of User zuul.
Jan 23 04:57:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:38 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:38 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:38.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:39 np0005593294 python3.9[103770]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:39.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:39 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:39 np0005593294 python3.9[103923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:40 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:40 np0005593294 python3.9[104001]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:40 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:40 np0005593294 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 04:57:40 np0005593294 systemd[1]: session-42.scope: Consumed 1.639s CPU time.
Jan 23 04:57:40 np0005593294 systemd-logind[807]: Session 42 logged out. Waiting for processes to exit.
Jan 23 04:57:40 np0005593294 systemd-logind[807]: Removed session 42.
Jan 23 04:57:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:41.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:41 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:42 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:42 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:43.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:43 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:44 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf80013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:44 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:44.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:45.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:45 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:46 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:46 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:46.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:47 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:47 np0005593294 systemd-logind[807]: New session 43 of user zuul.
Jan 23 04:57:47 np0005593294 systemd[1]: Started Session 43 of User zuul.
Jan 23 04:57:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:48 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:48 np0005593294 python3.9[104186]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:57:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:48 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:48.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:49 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:49 np0005593294 python3.9[104343]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:50 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:50 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:50 np0005593294 python3.9[104518]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:50.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:51.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:51 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:51 np0005593294 python3.9[104596]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.u1_6i0p7 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.116387) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272116621, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1422, "num_deletes": 252, "total_data_size": 4154785, "memory_usage": 4203160, "flush_reason": "Manual Compaction"}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272132891, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1767345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10469, "largest_seqno": 11886, "table_properties": {"data_size": 1762718, "index_size": 2087, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11771, "raw_average_key_size": 20, "raw_value_size": 1752658, "raw_average_value_size": 2980, "num_data_blocks": 94, "num_entries": 588, "num_filter_entries": 588, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162146, "oldest_key_time": 1769162146, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 16686 microseconds, and 7444 cpu microseconds.
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.133137) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1767345 bytes OK
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.133212) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.134600) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.134625) EVENT_LOG_v1 {"time_micros": 1769162272134619, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.134653) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4148107, prev total WAL file size 4148107, number of live WAL files 2.
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.136272) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323533' seq:0, type:0; will stop at (end)
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1725KB)], [21(13MB)]
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272136402, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15549399, "oldest_snapshot_seqno": -1}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4245 keys, 13455552 bytes, temperature: kUnknown
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272225164, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13455552, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13423230, "index_size": 20628, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 107951, "raw_average_key_size": 25, "raw_value_size": 13341601, "raw_average_value_size": 3142, "num_data_blocks": 884, "num_entries": 4245, "num_filter_entries": 4245, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.225570) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13455552 bytes
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.227157) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.0 rd, 151.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(16.4) write-amplify(7.6) OK, records in: 4709, records dropped: 464 output_compression: NoCompression
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.227194) EVENT_LOG_v1 {"time_micros": 1769162272227178, "job": 10, "event": "compaction_finished", "compaction_time_micros": 88841, "compaction_time_cpu_micros": 36567, "output_level": 6, "num_output_files": 1, "total_output_size": 13455552, "num_input_records": 4709, "num_output_records": 4245, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272227960, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272232776, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.136120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232874) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:52 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:52 np0005593294 python3.9[104749]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:52 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:52.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:53 np0005593294 python3.9[104827]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.js2dmu92 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:53 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:53 np0005593294 python3.9[104980]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:54 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:54 np0005593294 python3.9[105132]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:54 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:55 np0005593294 python3.9[105210]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:55.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:55 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:55 np0005593294 python3.9[105363]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:56 np0005593294 python3.9[105441]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:56 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:56 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:56 np0005593294 python3.9[105593]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:56.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:57.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:57 np0005593294 python3.9[105746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:57 np0005593294 python3.9[105849]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:58 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:58 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:58 np0005593294 python3.9[106001]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:57:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:58.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:57:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:57:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:59 np0005593294 python3.9[106079]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:59 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:00 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:00 np0005593294 python3.9[106232]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:58:00 np0005593294 systemd[1]: Reloading.
Jan 23 04:58:00 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:58:00 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:58:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:00 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:00.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:01.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:01 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:01 np0005593294 python3.9[106421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:01 np0005593294 python3.9[106500]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:02 np0005593294 kernel: ganesha.nfsd[104028]: segfault at 50 ip 00007efd7eb8332e sp 00007efce37fd210 error 4 in libntirpc.so.5.8[7efd7eb68000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 04:58:02 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:58:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:02 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy ignored for local
Jan 23 04:58:02 np0005593294 systemd[1]: Started Process Core Dump (PID 106651/UID 0).
Jan 23 04:58:02 np0005593294 python3.9[106654]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:02.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:03 np0005593294 python3.9[106732]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:03.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:03 np0005593294 systemd-coredump[106653]: Process 99512 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007efd7eb8332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007efd7eb8d900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 04:58:03 np0005593294 systemd[1]: systemd-coredump@2-106651-0.service: Deactivated successfully.
Jan 23 04:58:03 np0005593294 systemd[1]: systemd-coredump@2-106651-0.service: Consumed 1.293s CPU time.
Jan 23 04:58:03 np0005593294 podman[106890]: 2026-01-23 09:58:03.736439387 +0000 UTC m=+0.028238564 container died 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Jan 23 04:58:03 np0005593294 python3.9[106885]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:58:03 np0005593294 systemd[1]: Reloading.
Jan 23 04:58:03 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:58:03 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:58:04 np0005593294 systemd[1]: Starting Create netns directory...
Jan 23 04:58:04 np0005593294 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:58:04 np0005593294 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:58:04 np0005593294 systemd[1]: Finished Create netns directory.
Jan 23 04:58:04 np0005593294 systemd[1]: var-lib-containers-storage-overlay-be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b-merged.mount: Deactivated successfully.
Jan 23 04:58:04 np0005593294 podman[106890]: 2026-01-23 09:58:04.752726933 +0000 UTC m=+1.044526080 container remove 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:58:04 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:58:04 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 04:58:04 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.644s CPU time.
Jan 23 04:58:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:58:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:04.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:58:05 np0005593294 python3.9[107120]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:58:05 np0005593294 network[107137]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:58:05 np0005593294 network[107138]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:58:05 np0005593294 network[107139]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:58:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:05.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:06.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095808 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:58:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:08.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:09.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:09 np0005593294 python3.9[107403]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:09 np0005593294 python3.9[107482]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:10 np0005593294 python3.9[107634]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:10.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:11.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:11 np0005593294 python3.9[107786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:11 np0005593294 python3.9[107865]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:12.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:58:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:13.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:58:13 np0005593294 python3.9[108017]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:58:13 np0005593294 systemd[1]: Starting Time & Date Service...
Jan 23 04:58:13 np0005593294 systemd[1]: Started Time & Date Service.
Jan 23 04:58:14 np0005593294 python3.9[108174]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:14 np0005593294 python3.9[108327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:14.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:15 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 3.
Jan 23 04:58:15 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:58:15 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.644s CPU time.
Jan 23 04:58:15 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:58:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:15.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:15 np0005593294 podman[108450]: 2026-01-23 09:58:15.340188416 +0000 UTC m=+0.075712184 container create a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 04:58:15 np0005593294 python3.9[108430]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:15 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:58:15 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:58:15 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:58:15 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:58:15 np0005593294 podman[108450]: 2026-01-23 09:58:15.310411347 +0000 UTC m=+0.045935175 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:58:15 np0005593294 podman[108450]: 2026-01-23 09:58:15.403210274 +0000 UTC m=+0.138734022 container init a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:58:15 np0005593294 podman[108450]: 2026-01-23 09:58:15.408261463 +0000 UTC m=+0.143785191 container start a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid)
Jan 23 04:58:15 np0005593294 bash[108450]: a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:58:15 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:58:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:58:16 np0005593294 python3.9[108659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:16 np0005593294 python3.9[108737]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.t5sc_d7x recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:16.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:17.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:17 np0005593294 python3.9[108889]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:17 np0005593294 python3.9[108993]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:18 np0005593294 python3.9[109145]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:18.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:19.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:19 np0005593294 python3[109299]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:58:20 np0005593294 python3.9[109451]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:20.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:21 np0005593294 python3.9[109529]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:21.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:58:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:58:22 np0005593294 python3.9[109682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:22 np0005593294 python3.9[109807]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162301.437065-895-253409366867912/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:22.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:23.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:23 np0005593294 python3.9[109960]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:24 np0005593294 python3.9[110038]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:24.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:24 np0005593294 python3.9[110190]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:25.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:25 np0005593294 python3.9[110268]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:26 np0005593294 python3.9[110421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:26 np0005593294 python3.9[110499]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:26.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:27.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:27 np0005593294 python3.9[110651]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:28 np0005593294 python3.9[110807]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:58:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:28.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:29 np0005593294 python3.9[110975]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:29.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:29 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8001970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:29 np0005593294 python3.9[111193]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:30 np0005593294 python3.9[111362]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:58:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:30.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:58:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:58:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:58:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:31.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:58:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:31 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:31 np0005593294 python3.9[111514]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:58:31 np0005593294 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 04:58:31 np0005593294 systemd[1]: session-43.scope: Consumed 30.505s CPU time.
Jan 23 04:58:31 np0005593294 systemd-logind[807]: Session 43 logged out. Waiting for processes to exit.
Jan 23 04:58:31 np0005593294 systemd-logind[807]: Removed session 43.
Jan 23 04:58:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095832 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:58:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:33.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:33.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:33 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:35.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:35.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:35 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:36 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:36 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:37.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:37.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:37 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:37 np0005593294 systemd-logind[807]: New session 44 of user zuul.
Jan 23 04:58:37 np0005593294 systemd[1]: Started Session 44 of User zuul.
Jan 23 04:58:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:38 np0005593294 python3.9[111748]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 04:58:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:39.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:39.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:39 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:39 np0005593294 python3.9[111900]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:58:40 np0005593294 python3.9[112055]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 23 04:58:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:40 np0005593294 python3.9[112207]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.fvcogk_j follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:58:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:41.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:58:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:41 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:41.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:41 np0005593294 python3.9[112333]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.fvcogk_j mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162320.3643203-103-1280156936906/.source.fvcogk_j _original_basename=.1kcbc928 follow=False checksum=6c63675b4fda7e0d01c328fcbe34dc890491aeeb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:42 np0005593294 python3.9[112485]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:58:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:43.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:43 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:43.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:43 np0005593294 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:58:44 np0005593294 python3.9[112641]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=#012 create=True mode=0644 path=/tmp/ansible.fvcogk_j state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:44 np0005593294 python3.9[112793]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.fvcogk_j' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:45.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:45 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:45.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:45 np0005593294 python3.9[112948]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.fvcogk_j state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:46 np0005593294 systemd[1]: session-44.scope: Deactivated successfully.
Jan 23 04:58:46 np0005593294 systemd[1]: session-44.scope: Consumed 5.017s CPU time.
Jan 23 04:58:46 np0005593294 systemd-logind[807]: Session 44 logged out. Waiting for processes to exit.
Jan 23 04:58:46 np0005593294 systemd-logind[807]: Removed session 44.
Jan 23 04:58:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:47.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:47 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:47.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:58:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:49.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:58:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:49 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:49.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:50 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:50 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:51.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:51 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:51.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:52 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:52 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:53.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:53 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:53.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:53 np0005593294 systemd-logind[807]: New session 45 of user zuul.
Jan 23 04:58:53 np0005593294 systemd[1]: Started Session 45 of User zuul.
Jan 23 04:58:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:54 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:54 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:54 np0005593294 python3.9[113130]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:58:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:55.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:55 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:58:55 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:58:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:55 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:55.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:56 np0005593294 python3.9[113288]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:58:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:56 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:56 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:57.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:57 np0005593294 python3.9[113442]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:58:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095857 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:58:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:57 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:57.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:58 np0005593294 python3.9[113621]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:58 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:58 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:58 np0005593294 python3.9[113774]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:58:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:58:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:58:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:59 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:58:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:58:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:58:59 np0005593294 python3.9[113927]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:00 np0005593294 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 04:59:00 np0005593294 systemd[1]: session-45.scope: Consumed 3.955s CPU time.
Jan 23 04:59:00 np0005593294 systemd-logind[807]: Session 45 logged out. Waiting for processes to exit.
Jan 23 04:59:00 np0005593294 systemd-logind[807]: Removed session 45.
Jan 23 04:59:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:00 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:00 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:01.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:01 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd4001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:01.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:02 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:02 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:03.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:03 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:03.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:04 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd4001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:04 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:05.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:05 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:59:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:05.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:59:06 np0005593294 systemd-logind[807]: New session 46 of user zuul.
Jan 23 04:59:06 np0005593294 systemd[1]: Started Session 46 of User zuul.
Jan 23 04:59:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:06 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:06 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd40022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:07 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:59:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:07.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:07 np0005593294 python3.9[114109]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:59:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:07 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:08 np0005593294 python3.9[114266]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:59:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:08 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:08 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:09.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:09 np0005593294 python3.9[114350]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:59:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:09 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd40022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:59:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:59:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:59:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:11.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:11 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:11.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:11 np0005593294 python3.9[114503]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:59:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:12 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd40022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:12 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:13 np0005593294 python3.9[114654]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:59:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:13.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:13 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:13.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:13 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:59:13 np0005593294 python3.9[114805]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:59:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:14 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:14 np0005593294 python3.9[114955]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:59:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:14 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:15.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:15 np0005593294 systemd[1]: session-46.scope: Deactivated successfully.
Jan 23 04:59:15 np0005593294 systemd[1]: session-46.scope: Consumed 5.738s CPU time.
Jan 23 04:59:15 np0005593294 systemd-logind[807]: Session 46 logged out. Waiting for processes to exit.
Jan 23 04:59:15 np0005593294 systemd-logind[807]: Removed session 46.
Jan 23 04:59:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:16 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:16 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:17.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:17 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:17.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:18 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:18 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 04:59:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:19.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 04:59:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095919 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:59:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:19 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:19.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:20 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:20 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:21.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:21 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:21.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:22 np0005593294 systemd-logind[807]: New session 47 of user zuul.
Jan 23 04:59:22 np0005593294 systemd[1]: Started Session 47 of User zuul.
Jan 23 04:59:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:23.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:23 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:23.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:23 np0005593294 python3.9[115165]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:59:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:24 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:24 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:25.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:25 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:25.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:25 np0005593294 python3.9[115322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:25 np0005593294 python3.9[115475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:26 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:26 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:26 np0005593294 python3.9[115627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:27.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:27 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:27.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:27 np0005593294 python3.9[115751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162366.1367166-150-121561062680685/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=200902694b7ce68180eae274ebcbc81826cfce70 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:28 np0005593294 python3.9[115903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:28 np0005593294 python3.9[116026]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162367.7156394-150-201791179724440/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ff17d6d1438a69ae92e7570d79b66fb807ae4885 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:59:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:29.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:59:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:29 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:29.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:29 np0005593294 python3.9[116178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:29 np0005593294 python3.9[116302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162368.931849-150-129764174144882/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=663b16ef7007139396e1a67c87fa3c37816c2c66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.475126) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370475268, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1154, "num_deletes": 251, "total_data_size": 2885437, "memory_usage": 2912432, "flush_reason": "Manual Compaction"}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 23 04:59:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370805650, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1881107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11891, "largest_seqno": 13040, "table_properties": {"data_size": 1876037, "index_size": 2594, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10506, "raw_average_key_size": 19, "raw_value_size": 1865914, "raw_average_value_size": 3386, "num_data_blocks": 116, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162273, "oldest_key_time": 1769162273, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 330716 microseconds, and 7290 cpu microseconds.
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.805872) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1881107 bytes OK
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.805944) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.812193) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.812317) EVENT_LOG_v1 {"time_micros": 1769162370812297, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.812361) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2879868, prev total WAL file size 2879868, number of live WAL files 2.
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.814038) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1837KB)], [24(12MB)]
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370814207, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15336659, "oldest_snapshot_seqno": -1}
Jan 23 04:59:30 np0005593294 python3.9[116454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4280 keys, 13223994 bytes, temperature: kUnknown
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370928940, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13223994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13192507, "index_size": 19665, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109447, "raw_average_key_size": 25, "raw_value_size": 13111325, "raw_average_value_size": 3063, "num_data_blocks": 828, "num_entries": 4280, "num_filter_entries": 4280, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.929192) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13223994 bytes
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.930801) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.6 rd, 115.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.8 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(15.2) write-amplify(7.0) OK, records in: 4796, records dropped: 516 output_compression: NoCompression
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.930846) EVENT_LOG_v1 {"time_micros": 1769162370930829, "job": 12, "event": "compaction_finished", "compaction_time_micros": 114797, "compaction_time_cpu_micros": 52375, "output_level": 6, "num_output_files": 1, "total_output_size": 13223994, "num_input_records": 4796, "num_output_records": 4280, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370931321, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370933470, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.813730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:31.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:31 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:31.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:31 np0005593294 python3.9[116606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:32 np0005593294 python3.9[116759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:32 np0005593294 python3.9[116882]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162371.7146757-329-194713526187765/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=f740c4b30b5527eca1229a1da8351348fcc44551 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:33.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:33 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:33.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:33 np0005593294 python3.9[117034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:33 np0005593294 python3.9[117158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162372.8842256-329-237368036976808/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:34 np0005593294 python3.9[117310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:35 np0005593294 python3.9[117433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162374.035501-329-77376745714389/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b196055dd1ec29a2d4bd394f7949ec509db05ad5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:35.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:35 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:35.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:35 np0005593294 python3.9[117586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:36 np0005593294 python3.9[117738]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:37 np0005593294 python3.9[117961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:37.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:37 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:37.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:37 np0005593294 python3.9[118085]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162376.5885682-495-92008184019561/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=a4c0461c7922277b00b636dd64d46b12688eb9b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:38 np0005593294 python3.9[118314]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:38 np0005593294 python3.9[118466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162377.7212057-495-19416349126539/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:39.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095939 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:59:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:39 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:39.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:39 np0005593294 python3.9[118618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:59:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:59:39 np0005593294 python3.9[118742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162378.960454-495-72322725231718/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9a1d9e1331601b6c16031f0051d14a3eadf04541 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:41.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:41 np0005593294 python3.9[118894]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:41 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:41.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:42 np0005593294 python3.9[119047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:42 np0005593294 python3.9[119170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162381.4740531-681-169956418797010/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:43.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:43 np0005593294 python3.9[119322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:43 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:43.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:44 np0005593294 python3.9[119475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:44 np0005593294 python3.9[119598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162383.4424393-762-250779937840994/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:45.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:45 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:45.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:45 np0005593294 python3.9[119751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:46 np0005593294 python3.9[119903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:47 np0005593294 python3.9[120028]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162385.9573383-840-251399452380848/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:47.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:47 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:47.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:47 np0005593294 python3.9[120181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:48 np0005593294 python3.9[120333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:59:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:49 np0005593294 python3.9[120458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162387.9418838-910-166972745957666/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:49.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:49 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002690 fd 48 proxy ignored for local
Jan 23 04:59:49 np0005593294 kernel: ganesha.nfsd[120334]: segfault at 50 ip 00007f5c62ec932e sp 00007f5bc6ffc210 error 4 in libntirpc.so.5.8[7f5c62eae000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 04:59:49 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:59:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:49.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:49 np0005593294 systemd[1]: Started Process Core Dump (PID 120519/UID 0).
Jan 23 04:59:49 np0005593294 python3.9[120624]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:50 np0005593294 python3.9[120790]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:51 np0005593294 systemd-coredump[120532]: Process 108470 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007f5c62ec932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 04:59:51 np0005593294 python3.9[120913]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162389.9972572-982-109459922440105/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:51 np0005593294 systemd[1]: systemd-coredump@3-120519-0.service: Deactivated successfully.
Jan 23 04:59:51 np0005593294 systemd[1]: systemd-coredump@3-120519-0.service: Consumed 1.658s CPU time.
Jan 23 04:59:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:51.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:51 np0005593294 podman[120918]: 2026-01-23 09:59:51.193207725 +0000 UTC m=+0.032825898 container died a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 04:59:51 np0005593294 systemd[1]: var-lib-containers-storage-overlay-521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913-merged.mount: Deactivated successfully.
Jan 23 04:59:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:51.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:51 np0005593294 python3.9[121086]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:59:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 604.3 total, 600.0 interval#012Cumulative writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s#012Cumulative WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 21.31 MB, 0.04 MB/s#012Interval WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 604.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 604.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 604.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 23 04:59:52 np0005593294 podman[120918]: 2026-01-23 09:59:52.010458359 +0000 UTC m=+0.850076512 container remove a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:59:52 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:59:52 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 04:59:52 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.907s CPU time.
Jan 23 04:59:52 np0005593294 python3.9[121266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:53 np0005593294 python3.9[121389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162391.9776301-1052-42453237798948/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:53.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:53.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:55.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:55.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:55 np0005593294 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 04:59:55 np0005593294 systemd[1]: session-47.scope: Consumed 23.336s CPU time.
Jan 23 04:59:55 np0005593294 systemd-logind[807]: Session 47 logged out. Waiting for processes to exit.
Jan 23 04:59:55 np0005593294 systemd-logind[807]: Removed session 47.
Jan 23 04:59:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095956 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:59:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:57.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:57.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:59.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 04:59:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:59.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:00 np0005593294 ceph-mon[80126]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:00:00 np0005593294 ceph-mon[80126]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:00:00 np0005593294 ceph-mon[80126]:     osd.1 observed slow operation indications in BlueStore
Jan 23 05:00:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:01.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100001 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:00:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:01.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:02 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 4.
Jan 23 05:00:02 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:00:02 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.907s CPU time.
Jan 23 05:00:02 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:00:02 np0005593294 systemd-logind[807]: New session 48 of user zuul.
Jan 23 05:00:02 np0005593294 systemd[1]: Started Session 48 of User zuul.
Jan 23 05:00:02 np0005593294 podman[121547]: 2026-01-23 10:00:02.543877641 +0000 UTC m=+0.043287046 container create 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True)
Jan 23 05:00:02 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:00:02 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:00:02 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:00:02 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:00:02 np0005593294 podman[121547]: 2026-01-23 10:00:02.608323159 +0000 UTC m=+0.107732564 container init 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True)
Jan 23 05:00:02 np0005593294 podman[121547]: 2026-01-23 10:00:02.523147662 +0000 UTC m=+0.022557077 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:00:02 np0005593294 podman[121547]: 2026-01-23 10:00:02.620740487 +0000 UTC m=+0.120149872 container start 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:00:02 np0005593294 bash[121547]: 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141
Jan 23 05:00:02 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:00:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:00:03 np0005593294 python3.9[121703]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:03.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:03.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:04 np0005593294 python3.9[121856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:04 np0005593294 python3.9[121979]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162403.3364599-58-95983260235889/.source.conf _original_basename=ceph.conf follow=False checksum=c8d90d44a83782ff84a3d797d09c3b204e2d1c61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:05.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:05 np0005593294 python3.9[122131]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:05.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:05 np0005593294 python3.9[122255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162404.8678617-58-255395524335640/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=a6273c4bda164a032598e5e81cbd7f6e9c0876d5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:06 np0005593294 systemd[1]: session-48.scope: Deactivated successfully.
Jan 23 05:00:06 np0005593294 systemd[1]: session-48.scope: Consumed 2.587s CPU time.
Jan 23 05:00:06 np0005593294 systemd-logind[807]: Session 48 logged out. Waiting for processes to exit.
Jan 23 05:00:06 np0005593294 systemd-logind[807]: Removed session 48.
Jan 23 05:00:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:07.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:07.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:00:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:00:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:09.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:09.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:11.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:12 np0005593294 systemd-logind[807]: New session 49 of user zuul.
Jan 23 05:00:12 np0005593294 systemd[1]: Started Session 49 of User zuul.
Jan 23 05:00:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:13.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:13 np0005593294 python3.9[122437]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:00:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:00:15 np0005593294 python3.9[122607]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:15.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:15 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd448000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:15.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:15 np0005593294 python3.9[122761]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:16 np0005593294 python3.9[122911]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:00:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:17.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:17 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd448000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:17.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:17 np0005593294 python3.9[123064]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 05:00:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100018 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:00:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:18 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:18 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:19 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:19.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:20 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:20 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 05:00:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:20 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:20 np0005593294 python3.9[123247]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:00:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:21 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:21.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:21 np0005593294 python3.9[123332]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:00:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:22 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:22 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:23.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:23 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:23.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:24 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:24 np0005593294 python3.9[123486]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:00:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:24 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:25.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:25 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:25.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:25 np0005593294 python3[123642]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 05:00:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:26 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:26 np0005593294 python3.9[123794]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:26 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:27 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:27.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:27 np0005593294 python3.9[123946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:27 np0005593294 python3.9[124025]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:28 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:28 np0005593294 python3.9[124177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:28 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:29 np0005593294 python3.9[124255]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.lbe55_d8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:29.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:29 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:29.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:30 np0005593294 python3.9[124408]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:30 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:30 np0005593294 python3.9[124486]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:30 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:31.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:31 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:31 np0005593294 python3.9[124639]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:32 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:32 np0005593294 python3[124792]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 05:00:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:32 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:33.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:33 np0005593294 python3.9[124944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:33 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:33 np0005593294 python3.9[125070]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162432.7276552-427-116407109939417/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:34 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:34 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:35 np0005593294 python3.9[125222]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 05:00:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:35.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 05:00:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:35 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:35.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:35 np0005593294 python3.9[125348]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162434.2599027-472-142941938339080/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:36 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:36 np0005593294 python3.9[125500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:36 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:37 np0005593294 python3.9[125625]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162435.9345121-517-246391593233414/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:37.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:37 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:37.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:37 np0005593294 python3.9[125778]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:38 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:38 np0005593294 python3.9[125928]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162437.4209535-562-204396651707816/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:38 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:00:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:00:39 np0005593294 python3.9[126080]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:39 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:39 np0005593294 python3.9[126206]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162438.7094913-607-232059713954712/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:40 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:40 np0005593294 python3.9[126358]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:40 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:41.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:41 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:41 np0005593294 python3.9[126510]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:41.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:42 np0005593294 python3.9[126666]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:42 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:42 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:43 np0005593294 python3.9[126818]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:43.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:43 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:43.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:44 np0005593294 python3.9[126972]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:00:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:44 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:44 np0005593294 python3.9[127126]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:44 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:45.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:45 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:45 np0005593294 python3.9[127281]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:45.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:45 np0005593294 ceph-osd[77616]: bluestore.MempoolThread fragmentation_score=0.000033 took=0.000313s
Jan 23 05:00:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:46 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:46 np0005593294 python3.9[127432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:00:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:46 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:47.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:47 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c0013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:00:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2156 writes, 13K keys, 2156 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2156 writes, 2156 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2156 writes, 13K keys, 2156 commit groups, 1.0 writes per commit group, ingest: 36.35 MB, 0.06 MB/s#012Interval WAL: 2156 writes, 2156 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     38.9      0.53              0.06         6    0.088       0      0       0.0       0.0#012  L6      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.0    125.0    110.6      0.55              0.18         5    0.111     22K   2298       0.0       0.0#012 Sum      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     64.0     75.6      1.08              0.25        11    0.099     22K   2298       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     64.1     75.8      1.08              0.25        10    0.108     22K   2298       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    125.0    110.6      0.55              0.18         5    0.111     22K   2298       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     39.1      0.53              0.06         5    0.105       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.020#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.1 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 1.47 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000161 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(86,1.26 MB,0.414803%) FilterBlock(11,73.42 KB,0.0235859%) IndexBlock(11,142.14 KB,0.0456609%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:00:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:47.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:48 np0005593294 python3.9[127588]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:48 np0005593294 ovs-vsctl[127589]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 05:00:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:48 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:48 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:49 np0005593294 python3.9[127741]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:49.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:49 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:49.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:49 np0005593294 python3.9[127897]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:49 np0005593294 ovs-vsctl[127899]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 05:00:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:50 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:50 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:51.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:51 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:51.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:52 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:52 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:53.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:53 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:53 np0005593294 ceph-mds[84630]: mds.beacon.cephfs.compute-1.bcvzvj missed beacon ack from the monitors
Jan 23 05:00:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:54 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:54 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:55.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:55 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:55.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:55 np0005593294 python3.9[128131]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:00:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:56 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:00:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:00:56 np0005593294 python3.9[128285]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:56 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:57.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:57 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:57.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:57 np0005593294 python3.9[128438]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:58 np0005593294 python3.9[128516]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:58 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:58 np0005593294 python3.9[128693]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:58 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:59 np0005593294 python3.9[128771]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:59.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:59 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:00:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:59.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:59 np0005593294 python3.9[128924]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:00 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:00 np0005593294 python3.9[129076]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:00 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:01 np0005593294 python3.9[129154]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:01 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:01.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:02 np0005593294 python3.9[129318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:02 np0005593294 python3.9[129396]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:03.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:03 np0005593294 python3.9[129548]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:01:03 np0005593294 systemd[1]: Reloading.
Jan 23 05:01:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:03 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:03.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:03 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:03 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:04 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:04 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:04 np0005593294 python3.9[129765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:05.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:05 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:05.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:01:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:01:05 np0005593294 python3.9[129843]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:06 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:06 np0005593294 python3.9[129996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:06 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:07 np0005593294 python3.9[130074]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:07.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:07 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:07.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:07 np0005593294 python3.9[130227]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:01:07 np0005593294 systemd[1]: Reloading.
Jan 23 05:01:08 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:08 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:08 np0005593294 systemd[1]: Starting Create netns directory...
Jan 23 05:01:08 np0005593294 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 05:01:08 np0005593294 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 05:01:08 np0005593294 systemd[1]: Finished Create netns directory.
Jan 23 05:01:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:09.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:09 np0005593294 python3.9[130421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:09 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:09.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:10 np0005593294 python3.9[130574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:10 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:10 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:11 np0005593294 python3.9[130698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162469.891899-1360-200251991842722/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:11.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:11 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd424000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:11.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:12 np0005593294 python3.9[130851]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:12 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:12 np0005593294 python3.9[131003]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:12 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:13.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:13 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:13.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:13 np0005593294 python3.9[131156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:14 np0005593294 python3.9[131279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162473.1772597-1459-219667465065434/.source.json _original_basename=.3iy7p8yy follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:14 np0005593294 python3.9[131429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:15.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:15 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:15.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:17.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:17 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:17 np0005593294 python3.9[131853]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 05:01:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:18 np0005593294 kernel: ganesha.nfsd[122702]: segfault at 50 ip 00007fd4d441432e sp 00007fd437ffe210 error 4 in libntirpc.so.5.8[7fd4d43f9000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 23 05:01:18 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:01:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:18 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy ignored for local
Jan 23 05:01:18 np0005593294 systemd[1]: Started Process Core Dump (PID 132031/UID 0).
Jan 23 05:01:18 np0005593294 python3.9[132032]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:01:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:19.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:19 np0005593294 systemd-coredump[132033]: Process 121566 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 54:#012#0  0x00007fd4d441432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007fd4d441e900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 05:01:19 np0005593294 systemd[1]: systemd-coredump@4-132031-0.service: Deactivated successfully.
Jan 23 05:01:19 np0005593294 systemd[1]: systemd-coredump@4-132031-0.service: Consumed 1.256s CPU time.
Jan 23 05:01:19 np0005593294 podman[132151]: 2026-01-23 10:01:19.876807411 +0000 UTC m=+0.052085632 container died 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 23 05:01:19 np0005593294 systemd[1]: var-lib-containers-storage-overlay-916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f-merged.mount: Deactivated successfully.
Jan 23 05:01:19 np0005593294 podman[132151]: 2026-01-23 10:01:19.920207198 +0000 UTC m=+0.095485399 container remove 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 23 05:01:19 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:01:20 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 05:01:20 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.798s CPU time.
Jan 23 05:01:20 np0005593294 python3[132201]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:01:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:21.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:21.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:23.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:23.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100124 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:01:25 np0005593294 podman[132247]: 2026-01-23 10:01:25.049347082 +0000 UTC m=+4.817359095 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 05:01:25 np0005593294 podman[132371]: 2026-01-23 10:01:25.211817725 +0000 UTC m=+0.058488966 container create 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:01:25 np0005593294 podman[132371]: 2026-01-23 10:01:25.179732147 +0000 UTC m=+0.026403468 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 05:01:25 np0005593294 python3[132201]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 05:01:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:25.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:25.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:27.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:27.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:29.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:29.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:29 np0005593294 python3.9[132564]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:01:30 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 5.
Jan 23 05:01:30 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:01:30 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.798s CPU time.
Jan 23 05:01:30 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:01:30 np0005593294 podman[132765]: 2026-01-23 10:01:30.470653972 +0000 UTC m=+0.064323512 container create 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 05:01:30 np0005593294 python3.9[132732]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:30 np0005593294 podman[132765]: 2026-01-23 10:01:30.435531968 +0000 UTC m=+0.029201508 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:01:30 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:30 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:30 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:30 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:30 np0005593294 podman[132765]: 2026-01-23 10:01:30.551416024 +0000 UTC m=+0.145085564 container init 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:01:30 np0005593294 podman[132765]: 2026-01-23 10:01:30.558134756 +0000 UTC m=+0.151804266 container start 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 23 05:01:30 np0005593294 bash[132765]: 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:01:30 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:01:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:01:30 np0005593294 python3.9[132897]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:01:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:31.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:31.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:31 np0005593294 python3.9[133049]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162490.983635-1693-194799596292884/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:32 np0005593294 python3.9[133125]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:01:32 np0005593294 systemd[1]: Reloading.
Jan 23 05:01:32 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:32 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:33 np0005593294 python3.9[133235]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:01:33 np0005593294 systemd[1]: Reloading.
Jan 23 05:01:33 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:33 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:33.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:33 np0005593294 systemd[1]: Starting ovn_controller container...
Jan 23 05:01:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:33.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:33 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:01:33 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43da7c3110500b2184bcac9d202c790e74899d463bae53ae641d21dda7a79896/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:33 np0005593294 systemd[1]: Started /usr/bin/podman healthcheck run 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162.
Jan 23 05:01:33 np0005593294 podman[133278]: 2026-01-23 10:01:33.622424859 +0000 UTC m=+0.136637926 container init 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:01:33 np0005593294 ovn_controller[133293]: + sudo -E kolla_set_configs
Jan 23 05:01:33 np0005593294 podman[133278]: 2026-01-23 10:01:33.650281122 +0000 UTC m=+0.164494179 container start 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:01:33 np0005593294 systemd[1]: Created slice User Slice of UID 0.
Jan 23 05:01:33 np0005593294 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 05:01:33 np0005593294 edpm-start-podman-container[133278]: ovn_controller
Jan 23 05:01:33 np0005593294 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 05:01:33 np0005593294 systemd[1]: Starting User Manager for UID 0...
Jan 23 05:01:33 np0005593294 edpm-start-podman-container[133277]: Creating additional drop-in dependency for "ovn_controller" (56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162)
Jan 23 05:01:33 np0005593294 systemd[1]: Reloading.
Jan 23 05:01:33 np0005593294 podman[133300]: 2026-01-23 10:01:33.783458366 +0000 UTC m=+0.123675004 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 05:01:33 np0005593294 systemd[133317]: Queued start job for default target Main User Target.
Jan 23 05:01:33 np0005593294 systemd[133317]: Created slice User Application Slice.
Jan 23 05:01:33 np0005593294 systemd[133317]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 05:01:33 np0005593294 systemd[133317]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:01:33 np0005593294 systemd[133317]: Reached target Paths.
Jan 23 05:01:33 np0005593294 systemd[133317]: Reached target Timers.
Jan 23 05:01:33 np0005593294 systemd[133317]: Starting D-Bus User Message Bus Socket...
Jan 23 05:01:33 np0005593294 systemd[133317]: Starting Create User's Volatile Files and Directories...
Jan 23 05:01:33 np0005593294 systemd[133317]: Finished Create User's Volatile Files and Directories.
Jan 23 05:01:33 np0005593294 systemd[133317]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:01:33 np0005593294 systemd[133317]: Reached target Sockets.
Jan 23 05:01:33 np0005593294 systemd[133317]: Reached target Basic System.
Jan 23 05:01:33 np0005593294 systemd[133317]: Reached target Main User Target.
Jan 23 05:01:33 np0005593294 systemd[133317]: Startup finished in 135ms.
Jan 23 05:01:33 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:33 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:34 np0005593294 systemd[1]: Started User Manager for UID 0.
Jan 23 05:01:34 np0005593294 systemd[1]: Started ovn_controller container.
Jan 23 05:01:34 np0005593294 systemd[1]: 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162-7ccd2d264f17c79c.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 05:01:34 np0005593294 systemd[1]: 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162-7ccd2d264f17c79c.service: Failed with result 'exit-code'.
Jan 23 05:01:34 np0005593294 systemd[1]: Started Session c1 of User root.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: INFO:__main__:Validating config file
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: INFO:__main__:Writing out command to execute
Jan 23 05:01:34 np0005593294 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: ++ cat /run_command
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + ARGS=
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + sudo kolla_copy_cacerts
Jan 23 05:01:34 np0005593294 systemd[1]: Started Session c2 of User root.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + [[ ! -n '' ]]
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + . kolla_extend_start
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + umask 0022
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 05:01:34 np0005593294 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 05:01:34 np0005593294 NetworkManager[48978]: <info>  [1769162494.2257] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 05:01:34 np0005593294 NetworkManager[48978]: <info>  [1769162494.2264] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 05:01:34 np0005593294 NetworkManager[48978]: <warn>  [1769162494.2267] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 05:01:34 np0005593294 NetworkManager[48978]: <info>  [1769162494.2274] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 05:01:34 np0005593294 NetworkManager[48978]: <info>  [1769162494.2279] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 05:01:34 np0005593294 NetworkManager[48978]: <info>  [1769162494.2282] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 05:01:34 np0005593294 kernel: br-int: entered promiscuous mode
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 05:01:34 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:34Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 05:01:34 np0005593294 systemd-udevd[133424]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:35.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:35.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:35Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:35Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:35Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:35Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:35Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:01:35Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:35 np0005593294 NetworkManager[48978]: <info>  [1769162495.6945] manager: (ovn-eb059b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 05:01:35 np0005593294 NetworkManager[48978]: <info>  [1769162495.6999] manager: (ovn-57e418-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 05:01:35 np0005593294 systemd-udevd[133426]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:35 np0005593294 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 05:01:35 np0005593294 NetworkManager[48978]: <info>  [1769162495.7152] device (genev_sys_6081): carrier: link connected
Jan 23 05:01:35 np0005593294 NetworkManager[48978]: <info>  [1769162495.7157] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 05:01:36 np0005593294 NetworkManager[48978]: <info>  [1769162496.1838] manager: (ovn-8fb585-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 05:01:37 np0005593294 python3.9[133557]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 05:01:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:37.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:37 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:01:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:37 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:01:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:37.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:38 np0005593294 python3.9[133710]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:38 np0005593294 python3.9[133858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162497.756444-1828-93450145547982/.source.yaml _original_basename=.9dol0os0 follow=False checksum=a80724acad465d51ee59522dfe4a3a5c05876d7d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:01:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:39.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:01:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:39.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:39 np0005593294 python3.9[134011]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:01:39 np0005593294 ovs-vsctl[134012]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 05:01:40 np0005593294 python3.9[134164]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:01:40 np0005593294 ovs-vsctl[134166]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 05:01:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:41.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:41.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:41 np0005593294 python3.9[134320]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:01:41 np0005593294 ovs-vsctl[134321]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 05:01:42 np0005593294 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 05:01:42 np0005593294 systemd[1]: session-49.scope: Consumed 59.553s CPU time.
Jan 23 05:01:42 np0005593294 systemd-logind[807]: Session 49 logged out. Waiting for processes to exit.
Jan 23 05:01:42 np0005593294 systemd-logind[807]: Removed session 49.
Jan 23 05:01:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:43.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:43.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:01:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:01:44 np0005593294 systemd[1]: Stopping User Manager for UID 0...
Jan 23 05:01:44 np0005593294 systemd[133317]: Activating special unit Exit the Session...
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped target Main User Target.
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped target Basic System.
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped target Paths.
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped target Sockets.
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped target Timers.
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:01:44 np0005593294 systemd[133317]: Closed D-Bus User Message Bus Socket.
Jan 23 05:01:44 np0005593294 systemd[133317]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:01:44 np0005593294 systemd[133317]: Removed slice User Application Slice.
Jan 23 05:01:44 np0005593294 systemd[133317]: Reached target Shutdown.
Jan 23 05:01:44 np0005593294 systemd[133317]: Finished Exit the Session.
Jan 23 05:01:44 np0005593294 systemd[133317]: Reached target Exit the Session.
Jan 23 05:01:44 np0005593294 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 05:01:44 np0005593294 systemd[1]: Stopped User Manager for UID 0.
Jan 23 05:01:44 np0005593294 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 05:01:44 np0005593294 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 05:01:44 np0005593294 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 05:01:44 np0005593294 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 05:01:44 np0005593294 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 05:01:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:44 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9360000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:44 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:45.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:45 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:45.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100146 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:01:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:46 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:46 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:47.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:47 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:47.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:48 np0005593294 systemd-logind[807]: New session 51 of user zuul.
Jan 23 05:01:48 np0005593294 systemd[1]: Started Session 51 of User zuul.
Jan 23 05:01:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:48 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:48 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:49 np0005593294 python3.9[134520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:01:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:49.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:49 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:01:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:49.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:01:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:50 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93640023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:50 np0005593294 python3.9[134677]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:50 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:51 np0005593294 python3.9[134829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:51.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:51 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:51.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:52 np0005593294 python3.9[134982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:52 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:52 np0005593294 python3.9[135135]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:52 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:01:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:53.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:01:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:53 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:53.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:53 np0005593294 python3.9[135287]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:54 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:54 np0005593294 python3.9[135438]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:01:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:54 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:55.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:55 np0005593294 python3.9[135590]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 05:01:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:55 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:56 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:56 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:56 np0005593294 python3.9[135741]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:57.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:57 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:57 np0005593294 python3.9[135863]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162516.3145523-214-124497351997352/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:58 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:58 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:59 np0005593294 python3.9[136043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:59.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:59 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:01:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:59.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:59 np0005593294 python3.9[136166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162518.60924-259-248567993364976/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:00 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:00 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:01 np0005593294 python3.9[136318]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:02:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:01.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:01 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:01 np0005593294 python3.9[136405]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:02:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:02 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:02 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100203 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:02:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:03.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:03 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:03.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:04 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:02:04Z|00025|memory|INFO|16000 kB peak resident set size after 30.5 seconds
Jan 23 05:02:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:02:04Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 23 05:02:04 np0005593294 podman[136508]: 2026-01-23 10:02:04.720809817 +0000 UTC m=+0.113958898 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:02:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:04 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:05 np0005593294 python3.9[136586]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:02:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:05.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:05 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:05.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:02:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:05 np0005593294 python3.9[136823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:06 np0005593294 python3.9[136944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162525.3221169-370-137297292832377/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:06 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:06 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:06 np0005593294 python3.9[137094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:07 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:07 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:02:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:07 np0005593294 python3.9[137215]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162526.517756-370-15219577684183/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:07 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:07.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:08 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:08 np0005593294 python3.9[137366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:08 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:09 np0005593294 python3.9[137487]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162528.3279688-502-172779167216476/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:09.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:09 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:09.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:09 np0005593294 python3.9[137638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:10 np0005593294 python3.9[137759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162529.431357-502-35243825754108/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:10 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:10 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:11 np0005593294 python3.9[137909]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:02:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:11.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:11 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:02:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:02:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:11 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:02:12 np0005593294 python3.9[138064]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:12 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:12 np0005593294 python3.9[138216]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:12 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:13 np0005593294 python3.9[138294]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:13.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:13 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:13 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:13 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:13.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:13 np0005593294 python3.9[138472]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:14 np0005593294 python3.9[138550]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:02:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:02:14 np0005593294 python3.9[138702]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:15.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:15 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:15.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:15 np0005593294 python3.9[138855]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:16 np0005593294 python3.9[138934]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:16 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:16 np0005593294 python3.9[139087]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:16 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9350000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:17 np0005593294 python3.9[139165]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:17.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:17 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9340000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:17.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:17 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:02:18 np0005593294 python3.9[139318]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:02:18 np0005593294 systemd[1]: Reloading.
Jan 23 05:02:18 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:18 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:18 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:18 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:19 np0005593294 python3.9[139533]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:19 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9350001aa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:02:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:19.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:02:19 np0005593294 python3.9[139612]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:20 np0005593294 python3.9[139764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:20 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:20 np0005593294 python3.9[139842]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:20 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:21 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:21.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:21 np0005593294 python3.9[139995]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:02:21 np0005593294 systemd[1]: Reloading.
Jan 23 05:02:21 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:21 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:22 np0005593294 systemd[1]: Starting Create netns directory...
Jan 23 05:02:22 np0005593294 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 05:02:22 np0005593294 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 05:02:22 np0005593294 systemd[1]: Finished Create netns directory.
Jan 23 05:02:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:22 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9350001aa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:22 np0005593294 python3.9[140189]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:22 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100223 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:02:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:23.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:23 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:23.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:23 np0005593294 python3.9[140342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:24 np0005593294 python3.9[140465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162543.319567-955-14467157977317/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:24 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:24 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93500027b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:25 np0005593294 python3.9[140617]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:25.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:25 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93500027b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:25.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:26 np0005593294 python3.9[140770]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:26 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:26 np0005593294 python3.9[140922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:26 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:27 np0005593294 python3.9[141045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162546.3477795-1054-214909290537178/.source.json _original_basename=.xs2hfnlj follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:27.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:27 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:02:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:27.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:02:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:28 np0005593294 python3.9[141196]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:28 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:28 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:29.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:29 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9340002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:29.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:30 np0005593294 python3.9[141620]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 05:02:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9340002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:31.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:31 np0005593294 python3.9[141772]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:02:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:31 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:31.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:32 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy ignored for local
Jan 23 05:02:32 np0005593294 kernel: ganesha.nfsd[134363]: segfault at 50 ip 00007f93eb60432e sp 00007f9354ff8210 error 4 in libntirpc.so.5.8[7f93eb5e9000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 05:02:32 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:02:32 np0005593294 systemd[1]: Started Process Core Dump (PID 141926/UID 0).
Jan 23 05:02:32 np0005593294 python3[141925]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:02:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:33.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:33 np0005593294 systemd-coredump[141927]: Process 132803 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 53:#012#0  0x00007f93eb60432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:02:33 np0005593294 systemd[1]: systemd-coredump@5-141926-0.service: Deactivated successfully.
Jan 23 05:02:33 np0005593294 systemd[1]: systemd-coredump@5-141926-0.service: Consumed 1.136s CPU time.
Jan 23 05:02:33 np0005593294 podman[141979]: 2026-01-23 10:02:33.993170276 +0000 UTC m=+0.034869697 container died 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:02:34 np0005593294 systemd[1]: var-lib-containers-storage-overlay-d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832-merged.mount: Deactivated successfully.
Jan 23 05:02:34 np0005593294 podman[141979]: 2026-01-23 10:02:34.094314093 +0000 UTC m=+0.136013494 container remove 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 05:02:34 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:02:34 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 05:02:34 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.263s CPU time.
Jan 23 05:02:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 05:02:34 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 05:02:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:02:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:02:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:36 np0005593294 podman[142034]: 2026-01-23 10:02:36.12566918 +0000 UTC m=+0.523642641 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:02:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:37.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:37.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100238 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:02:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:39.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:02:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:39.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:02:41 np0005593294 podman[141940]: 2026-01-23 10:02:41.193248263 +0000 UTC m=+8.505194511 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:02:41 np0005593294 podman[142168]: 2026-01-23 10:02:41.352787499 +0000 UTC m=+0.041784698 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:02:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:41.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:41.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:41 np0005593294 podman[142168]: 2026-01-23 10:02:41.721097278 +0000 UTC m=+0.410094377 container create e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 23 05:02:41 np0005593294 python3[141925]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:02:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:43.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 05:02:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:43.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 05:02:44 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 6.
Jan 23 05:02:44 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:02:44 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.263s CPU time.
Jan 23 05:02:44 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:02:44 np0005593294 podman[142279]: 2026-01-23 10:02:44.538409547 +0000 UTC m=+0.041488000 container create 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS)
Jan 23 05:02:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:44 np0005593294 podman[142279]: 2026-01-23 10:02:44.591959897 +0000 UTC m=+0.095038390 container init 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Jan 23 05:02:44 np0005593294 podman[142279]: 2026-01-23 10:02:44.596505467 +0000 UTC m=+0.099583920 container start 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Jan 23 05:02:44 np0005593294 bash[142279]: 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb
Jan 23 05:02:44 np0005593294 podman[142279]: 2026-01-23 10:02:44.52141597 +0000 UTC m=+0.024494463 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:02:44 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:02:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:02:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:45.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:45.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:47.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:47.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:48 np0005593294 python3.9[142466]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:02:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:49.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:49.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:49 np0005593294 python3.9[142621]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:50 np0005593294 python3.9[142697]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:02:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:02:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:02:51 np0005593294 python3.9[142848]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162570.24057-1288-43555596506973/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:51.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:51 np0005593294 python3.9[142924]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:02:51 np0005593294 systemd[1]: Reloading.
Jan 23 05:02:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:51.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:51 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:51 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:52 np0005593294 python3.9[143036]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:02:52 np0005593294 systemd[1]: Reloading.
Jan 23 05:02:52 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:52 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:52 np0005593294 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 05:02:52 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:02:53 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebdebd315409c6b64b2dfe6f1c70aa65fc33d875c067e0175d97367d40cd3030/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:53 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebdebd315409c6b64b2dfe6f1c70aa65fc33d875c067e0175d97367d40cd3030/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:53 np0005593294 systemd[1]: Started /usr/bin/podman healthcheck run e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6.
Jan 23 05:02:53 np0005593294 podman[143077]: 2026-01-23 10:02:53.045373287 +0000 UTC m=+0.150750584 container init e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + sudo -E kolla_set_configs
Jan 23 05:02:53 np0005593294 podman[143077]: 2026-01-23 10:02:53.075532983 +0000 UTC m=+0.180910300 container start e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:02:53 np0005593294 edpm-start-podman-container[143077]: ovn_metadata_agent
Jan 23 05:02:53 np0005593294 podman[143099]: 2026-01-23 10:02:53.13540116 +0000 UTC m=+0.050113710 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:02:53 np0005593294 edpm-start-podman-container[143076]: Creating additional drop-in dependency for "ovn_metadata_agent" (e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6)
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Validating config file
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Copying service configuration files
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Writing out command to execute
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: ++ cat /run_command
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + CMD=neutron-ovn-metadata-agent
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + ARGS=
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + sudo kolla_copy_cacerts
Jan 23 05:02:53 np0005593294 systemd[1]: Reloading.
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + [[ ! -n '' ]]
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + . kolla_extend_start
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + umask 0022
Jan 23 05:02:53 np0005593294 ovn_metadata_agent[143093]: + exec neutron-ovn-metadata-agent
Jan 23 05:02:53 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:53 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:53 np0005593294 systemd[1]: Started ovn_metadata_agent container.
Jan 23 05:02:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:53.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:53.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.993 143098 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.993 143098 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.993 143098 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.036 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.036 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.036 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.037 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.037 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.056 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 170ec811-bf2b-4b3a-9339-50a49c79a1e6 (UUID: 170ec811-bf2b-4b3a-9339-50a49c79a1e6) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.079 143098 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.080 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.080 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.080 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.083 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.088 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.095 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '170ec811-bf2b-4b3a-9339-50a49c79a1e6'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], external_ids={}, name=170ec811-bf2b-4b3a-9339-50a49c79a1e6, nb_cfg_timestamp=1769162502251, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.096 143098 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f31a2f53f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.097 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.097 143098 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.097 143098 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.098 143098 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.102 143098 DEBUG oslo_service.service [-] Started child 143210 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.106 143098 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmplrjtqy92/privsep.sock']#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.107 143210 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1999580'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.129 143210 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.129 143210 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.129 143210 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.132 143210 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.140 143210 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.146 143210 INFO eventlet.wsgi.server [-] (143210) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 23 05:02:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:55.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:55.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:55 np0005593294 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.813 143098 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.814 143098 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplrjtqy92/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.675 143216 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.680 143216 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.683 143216 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.684 143216 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143216#033[00m
Jan 23 05:02:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.817 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[97a1aa72-315d-4cf5-82dd-aa959237d05f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.335 143216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.335 143216 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.335 143216 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.880 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[f14283d7-5ad3-4da1-a385-14b4d45ef7b4]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.882 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, column=external_ids, values=({'neutron:ovn-metadata-id': '49259f78-9be2-5e1c-94bb-1c5d5138e24a'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.896 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.903 143098 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.903 143098 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.941 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.941 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.941 143098 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:02:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000015s ======
Jan 23 05:02:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:57.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000015s
Jan 23 05:02:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:57.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:02:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:02:58 np0005593294 python3.9[143361]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 05:02:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:59.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:02:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000015s ======
Jan 23 05:02:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:59.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000015s
Jan 23 05:03:00 np0005593294 python3.9[143539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:03:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:00 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:00 np0005593294 python3.9[143664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162579.658673-1423-8608494354108/.source.yaml _original_basename=.i0sf62_b follow=False checksum=d88282ad6bcd11f7bd2cbc3f4703eb6122d6b05d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:01 np0005593294 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 05:03:01 np0005593294 systemd[1]: session-51.scope: Consumed 56.527s CPU time.
Jan 23 05:03:01 np0005593294 systemd-logind[807]: Session 51 logged out. Waiting for processes to exit.
Jan 23 05:03:01 np0005593294 systemd-logind[807]: Removed session 51.
Jan 23 05:03:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:01.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:01.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100302 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:03:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:02 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 05:03:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:03.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 05:03:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000015s ======
Jan 23 05:03:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:03.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000015s
Jan 23 05:03:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:04 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 05:03:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:05.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 05:03:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 05:03:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:05.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 05:03:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 05:03:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 05:03:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:07.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:08 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c002d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:09.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:09.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:09 np0005593294 podman[143694]: 2026-01-23 10:03:09.756355299 +0000 UTC m=+0.147780020 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:03:10 np0005593294 systemd-logind[807]: New session 52 of user zuul.
Jan 23 05:03:10 np0005593294 systemd[1]: Started Session 52 of User zuul.
Jan 23 05:03:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:10 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c002d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:11 np0005593294 python3.9[143875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:03:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:11.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034008f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:11.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:12 np0005593294 python3.9[144033]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:13.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:13.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:13 np0005593294 python3.9[144247]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:03:13 np0005593294 systemd[1]: Reloading.
Jan 23 05:03:14 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:03:14 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:03:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:14 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:15 np0005593294 python3.9[144467]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:03:15 np0005593294 network[144485]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:03:15 np0005593294 network[144486]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:03:15 np0005593294 network[144487]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:03:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:15.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:15.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:16 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:03:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:03:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:17.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:19.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:19.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:20 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:21 np0005593294 python3.9[144776]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:21.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:21.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:22 np0005593294 python3.9[144930]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:22 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:23 np0005593294 python3.9[145083]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:23.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:23 np0005593294 podman[145209]: 2026-01-23 10:03:23.622785847 +0000 UTC m=+0.084185946 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:03:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:23.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:23 np0005593294 python3.9[145254]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:24 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:24 np0005593294 python3.9[145409]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:25 np0005593294 python3.9[145562]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:03:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:25.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:03:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:25.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:26 np0005593294 python3.9[145716]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:27.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:27.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:27 np0005593294 python3.9[145895]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:28 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:28 np0005593294 python3.9[146047]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:29 np0005593294 python3.9[146199]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:29.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:29.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:30 np0005593294 python3.9[146354]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:30 np0005593294 python3.9[146506]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:31.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:31 np0005593294 python3.9[146659]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:31.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:32 np0005593294 python3.9[146811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:32 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:32 np0005593294 python3.9[146963]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:03:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:33.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:03:33 np0005593294 python3.9[147116]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:33.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:34 np0005593294 python3.9[147268]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:34 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:34 np0005593294 python3.9[147420]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:35 np0005593294 python3.9[147572]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:35.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:35.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:36 np0005593294 python3.9[147725]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:36 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:36 np0005593294 python3.9[147877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:37.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:37 np0005593294 python3.9[148030]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:37.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:38 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:38 np0005593294 python3.9[148182]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:03:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:39.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:39 np0005593294 python3.9[148360]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:03:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:39.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:39 np0005593294 systemd[1]: Reloading.
Jan 23 05:03:39 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:03:39 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:03:39 np0005593294 podman[148362]: 2026-01-23 10:03:39.932606284 +0000 UTC m=+0.111219943 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 05:03:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:40 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:40 np0005593294 python3.9[148574]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:41.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:41 np0005593294 python3.9[148728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:41.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:42 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:43 np0005593294 python3.9[148881]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:43.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:43.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:43 np0005593294 python3.9[149035]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:44 np0005593294 python3.9[149188]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:45 np0005593294 python3.9[149341]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:45.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:45.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:46 np0005593294 python3.9[149495]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:46 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:47 np0005593294 python3.9[149648]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 05:03:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:47.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:47.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:48 np0005593294 python3.9[149802]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 05:03:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:48 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:49 np0005593294 python3.9[149960]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 05:03:49 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:03:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:49.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:03:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:03:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:50 np0005593294 python3.9[150122]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:03:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:03:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:51.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:03:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:51 np0005593294 python3.9[150207]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:03:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:51.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:52 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100354 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:03:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:54 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:54 np0005593294 podman[150216]: 2026-01-23 10:03:54.685781587 +0000 UTC m=+0.079505828 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 05:03:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:03:55.029 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:03:55.030 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:03:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:03:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:55.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:03:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:55.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:56 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:57.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:57.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100359 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:03:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:59.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:03:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:59.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:00 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:01.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:01.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:02 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:04:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:03.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:03.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:04 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:05.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:05.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:04:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:04:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:07.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:07.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:08 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:04:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:09.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:10 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:10 np0005593294 podman[150449]: 2026-01-23 10:04:10.773885111 +0000 UTC m=+0.164532380 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 05:04:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:11.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:04:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:13.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:14 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:04:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:04:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:15.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:15.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100416 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:04:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:16 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:17.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:17.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:04:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:18 np0005593294 kernel: SELinux:  Converting 2779 SID table entries...
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 05:04:18 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 05:04:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:19 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 23 05:04:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:04:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:19.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:04:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:04:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:19.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:04:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:20 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100421 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:04:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:21.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:21.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:22 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:23.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:23.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:24 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:04:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:25.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:04:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:25 np0005593294 podman[150518]: 2026-01-23 10:04:25.742343689 +0000 UTC m=+0.100987185 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:04:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:27.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:27.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:28 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:29 np0005593294 kernel: SELinux:  Converting 2779 SID table entries...
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 05:04:29 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 05:04:29 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:29 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:04:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:29.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:04:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:29.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:04:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:04:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:31.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:31.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:32 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000097s ======
Jan 23 05:04:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:33.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000097s
Jan 23 05:04:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:34 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:04:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:35.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:04:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:35 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 05:04:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:35.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:36 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:36 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:36 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:37.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:37.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:38 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:39.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 05:04:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:39.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 05:04:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:40 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:41.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:41 np0005593294 podman[150684]: 2026-01-23 10:04:41.712785226 +0000 UTC m=+0.104185878 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 05:04:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:42 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:43.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:04:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:45.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:04:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:46 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:47.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:48 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:49.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:49.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:52 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:53.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:53.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:54 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:04:55.030 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:04:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:04:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:55.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:55.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:56 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003ed0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:56 np0005593294 podman[158885]: 2026-01-23 10:04:56.668430653 +0000 UTC m=+0.070349665 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:04:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:57.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:57.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:59.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:04:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:59.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:00 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:01.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:01.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:02 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:03.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:04 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80380013a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:05.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:05.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f800c000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:07.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:07.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:08 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:09.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:09.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:10 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f800c001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:11.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:11.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:12 np0005593294 podman[167485]: 2026-01-23 10:05:12.707724302 +0000 UTC m=+0.105029295 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:05:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:13.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:14 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000097s ======
Jan 23 05:05:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000097s
Jan 23 05:05:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:15.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:16 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:17.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:17.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100518 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:05:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:19.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:19.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:20 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:21.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:21.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:22 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038004440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:23.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:23.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:24 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038004440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:25.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:25.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:26 np0005593294 kernel: SELinux:  Converting 2780 SID table entries...
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability open_perms=1
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability always_check_network=0
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 05:05:26 np0005593294 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 05:05:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:26 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 05:05:26 np0005593294 podman[167710]: 2026-01-23 10:05:26.947191901 +0000 UTC m=+0.062311076 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:05:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:05:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:27 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 05:05:27 np0005593294 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 05:05:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:27.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:27.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:28 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038004440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:29.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:29.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:05:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:05:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:31.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:31.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:32 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:05:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f800c0028c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:33.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:33.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:34 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:35 np0005593294 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 05:05:35 np0005593294 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 05:05:35 np0005593294 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 05:05:35 np0005593294 systemd[1]: sshd.service: Consumed 3.108s CPU time, read 32.0K from disk, written 80.0K to disk.
Jan 23 05:05:35 np0005593294 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 05:05:35 np0005593294 systemd[1]: Stopping sshd-keygen.target...
Jan 23 05:05:35 np0005593294 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 05:05:35 np0005593294 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 05:05:35 np0005593294 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 05:05:35 np0005593294 systemd[1]: Reached target sshd-keygen.target.
Jan 23 05:05:35 np0005593294 systemd[1]: Starting OpenSSH server daemon...
Jan 23 05:05:35 np0005593294 systemd[1]: Started OpenSSH server daemon.
Jan 23 05:05:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:35.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:35.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:36 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:36 np0005593294 podman[168873]: 2026-01-23 10:05:36.729640513 +0000 UTC m=+0.067186342 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 05:05:36 np0005593294 podman[168873]: 2026-01-23 10:05:36.832918731 +0000 UTC m=+0.170464560 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 05:05:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:37 np0005593294 podman[169050]: 2026-01-23 10:05:37.292685156 +0000 UTC m=+0.064757655 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:05:37 np0005593294 podman[169050]: 2026-01-23 10:05:37.308832413 +0000 UTC m=+0.080904862 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:05:37 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 05:05:37 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 05:05:37 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:37 np0005593294 podman[169179]: 2026-01-23 10:05:37.728562796 +0000 UTC m=+0.077536674 container exec 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:05:37 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:37 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:37.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:37 np0005593294 podman[169179]: 2026-01-23 10:05:37.778899088 +0000 UTC m=+0.127872906 container exec_died 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:37 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 05:05:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:38 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:39.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:40 np0005593294 podman[169524]: 2026-01-23 10:05:40.04484279 +0000 UTC m=+1.955055936 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 05:05:40 np0005593294 podman[169524]: 2026-01-23 10:05:40.053860979 +0000 UTC m=+1.964074155 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 05:05:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100540 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:05:40 np0005593294 podman[172103]: 2026-01-23 10:05:40.307545373 +0000 UTC m=+0.056639645 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, com.redhat.component=keepalived-container, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=keepalived, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=)
Jan 23 05:05:40 np0005593294 podman[172103]: 2026-01-23 10:05:40.323827625 +0000 UTC m=+0.072921847 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vcs-type=git, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793)
Jan 23 05:05:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:40 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:05:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:05:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:41.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:42 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:43 np0005593294 podman[175760]: 2026-01-23 10:05:43.678224256 +0000 UTC m=+0.085773777 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:05:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:43.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:45.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:45 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 05:05:45 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 05:05:45 np0005593294 systemd[1]: man-db-cache-update.service: Consumed 10.558s CPU time.
Jan 23 05:05:45 np0005593294 systemd[1]: run-r69442ff605504ec99d0d39e9e17d0e5a.service: Deactivated successfully.
Jan 23 05:05:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:45.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:46 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:46 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:46 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:47.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:48.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:48 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:49.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:50.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:51.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:52.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:52 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:52 np0005593294 python3.9[178050]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:53 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:53 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:53 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:53.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:54.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:54 np0005593294 python3.9[178241]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:54 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:54 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:54 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:54 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:05:55.030 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:05:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:05:55.032 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:55 np0005593294 python3.9[178431]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:55 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:55 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:55 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:55.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:05:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:56.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:05:56 np0005593294 python3.9[178623]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:56 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:56 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:56 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:56 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy ignored for local
Jan 23 05:05:56 np0005593294 kernel: ganesha.nfsd[143339]: segfault at 50 ip 00007f80bffe732e sp 00007f80427fb210 error 4 in libntirpc.so.5.8[7f80bffcc000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 05:05:56 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:05:56 np0005593294 systemd[1]: Started Process Core Dump (PID 178660/UID 0).
Jan 23 05:05:57 np0005593294 podman[178786]: 2026-01-23 10:05:57.427129599 +0000 UTC m=+0.067893584 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:05:57 np0005593294 python3.9[178834]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:05:57 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:57.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:57 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:57 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:57 np0005593294 systemd-coredump[178662]: Process 142298 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007f80bffe732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:05:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:58 np0005593294 podman[178875]: 2026-01-23 10:05:58.075266481 +0000 UTC m=+0.026826830 container died 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 05:05:58 np0005593294 podman[178875]: 2026-01-23 10:05:58.114890478 +0000 UTC m=+0.066450807 container remove 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 05:05:58 np0005593294 systemd[1]: var-lib-containers-storage-overlay-a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005-merged.mount: Deactivated successfully.
Jan 23 05:05:58 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:05:58 np0005593294 systemd[1]: systemd-coredump@6-178660-0.service: Deactivated successfully.
Jan 23 05:05:58 np0005593294 systemd[1]: systemd-coredump@6-178660-0.service: Consumed 1.094s CPU time.
Jan 23 05:05:58 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 05:05:58 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.656s CPU time.
Jan 23 05:05:58 np0005593294 python3.9[179071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:05:59 np0005593294 systemd[1]: Reloading.
Jan 23 05:05:59 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:59 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:05:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:59.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:00 np0005593294 python3.9[179288]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:00 np0005593294 systemd[1]: Reloading.
Jan 23 05:06:00 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:06:00 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:06:01 np0005593294 python3.9[179478]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:01.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:02.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:02 np0005593294 python3.9[179634]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:02 np0005593294 systemd[1]: Reloading.
Jan 23 05:06:02 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:06:02 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:06:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100602 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:06:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:03 np0005593294 python3.9[179824]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:06:03 np0005593294 systemd[1]: Reloading.
Jan 23 05:06:03 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:06:03 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:06:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:03.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:03 np0005593294 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 05:06:03 np0005593294 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 05:06:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:04.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:04 np0005593294 python3.9[180018]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:05 np0005593294 python3.9[180173]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:06.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:06 np0005593294 python3.9[180329]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:07.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:08.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:08 np0005593294 python3.9[180485]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:08 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 7.
Jan 23 05:06:08 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:06:08 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.656s CPU time.
Jan 23 05:06:08 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:06:08 np0005593294 podman[180607]: 2026-01-23 10:06:08.593704612 +0000 UTC m=+0.081602292 container create 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:06:08 np0005593294 podman[180607]: 2026-01-23 10:06:08.539388644 +0000 UTC m=+0.027286344 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:06:08 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:08 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:08 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:08 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:08 np0005593294 podman[180607]: 2026-01-23 10:06:08.667840385 +0000 UTC m=+0.155738095 container init 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:06:08 np0005593294 podman[180607]: 2026-01-23 10:06:08.674542779 +0000 UTC m=+0.162440449 container start 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:06:08 np0005593294 bash[180607]: 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da
Jan 23 05:06:08 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:06:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:06:09 np0005593294 python3.9[180718]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:09 np0005593294 python3.9[180896]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:09.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:10.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:10 np0005593294 python3.9[181051]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:11 np0005593294 python3.9[181206]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:11.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:12.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:12 np0005593294 python3.9[181362]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:13 np0005593294 python3.9[181517]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:13 np0005593294 podman[181645]: 2026-01-23 10:06:13.987668612 +0000 UTC m=+0.133168634 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:06:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:14.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:14 np0005593294 python3.9[181693]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:06:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:06:15 np0005593294 python3.9[181854]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:15.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:16.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:17 np0005593294 python3.9[182010]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:17.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:17 np0005593294 python3.9[182166]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:18.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:18 np0005593294 python3.9[182321]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:19 np0005593294 python3.9[182474]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:19.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:20.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:20 np0005593294 python3.9[182651]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:06:20 np0005593294 python3.9[182803]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:06:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:06:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:21 np0005593294 python3.9[182972]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:21.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:22.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:22 np0005593294 python3.9[183124]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:22 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:23 np0005593294 python3.9[183274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:06:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:23.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:24.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:24 np0005593294 python3.9[183427]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100624 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:06:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:24 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:24 np0005593294 python3.9[183552]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162783.3902836-1642-68642680155090/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:25 np0005593294 python3.9[183704]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:25.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.880189) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785880315, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4222, "num_deletes": 502, "total_data_size": 11737855, "memory_usage": 11936008, "flush_reason": "Manual Compaction"}
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785914772, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4407138, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13045, "largest_seqno": 17262, "table_properties": {"data_size": 4395820, "index_size": 6404, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3845, "raw_key_size": 30531, "raw_average_key_size": 19, "raw_value_size": 4368991, "raw_average_value_size": 2857, "num_data_blocks": 279, "num_entries": 1529, "num_filter_entries": 1529, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162371, "oldest_key_time": 1769162371, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 34630 microseconds, and 10807 cpu microseconds.
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.914863) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4407138 bytes OK
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.914904) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.916406) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.916468) EVENT_LOG_v1 {"time_micros": 1769162785916458, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.916491) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11719031, prev total WAL file size 11719031, number of live WAL files 2.
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.919216) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323532' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4303KB)], [27(12MB)]
Jan 23 05:06:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785919321, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17631132, "oldest_snapshot_seqno": -1}
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4981 keys, 13174592 bytes, temperature: kUnknown
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786018953, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13174592, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13139517, "index_size": 21525, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 124878, "raw_average_key_size": 25, "raw_value_size": 13047188, "raw_average_value_size": 2619, "num_data_blocks": 899, "num_entries": 4981, "num_filter_entries": 4981, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.019217) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13174592 bytes
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.020349) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.9 rd, 132.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 12.6 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(7.0) write-amplify(3.0) OK, records in: 5809, records dropped: 828 output_compression: NoCompression
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.020373) EVENT_LOG_v1 {"time_micros": 1769162786020360, "job": 14, "event": "compaction_finished", "compaction_time_micros": 99694, "compaction_time_cpu_micros": 30705, "output_level": 6, "num_output_files": 1, "total_output_size": 13174592, "num_input_records": 5809, "num_output_records": 4981, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786021311, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786023752, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.919058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:26.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:26 np0005593294 python3.9[183830]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162784.9180737-1642-213915064466572/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:26 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:26 np0005593294 python3.9[183982]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:27 np0005593294 python3.9[184107]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162786.3138676-1642-13326962495522/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:27 np0005593294 podman[184156]: 2026-01-23 10:06:27.653238979 +0000 UTC m=+0.051949213 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:06:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:27.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:28 np0005593294 python3.9[184279]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:28.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:28 np0005593294 python3.9[184404]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162787.5738187-1642-235121199264350/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:28 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:29 np0005593294 python3.9[184556]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:29.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:30.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:30 np0005593294 python3.9[184682]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162788.7690985-1642-16349019001806/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:30 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:31 np0005593294 python3.9[184834]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:31 np0005593294 python3.9[184962]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162790.5594811-1642-38285098709651/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:31.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 05:06:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:32.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 05:06:32 np0005593294 python3.9[185114]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:32 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:33 np0005593294 python3.9[185237]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162791.9281554-1642-112199368998397/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:33 np0005593294 python3.9[185390]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:33.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:34.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:34 np0005593294 python3.9[185515]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162793.2360907-1642-8608816156380/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:34 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:35 np0005593294 python3.9[185667]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 05:06:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:35.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:36 np0005593294 python3.9[185821]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:36.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:36 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc0032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:36 np0005593294 python3.9[185973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:37 np0005593294 python3.9[186125]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:37.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:38.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:38 np0005593294 python3.9[186278]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:38 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:38 np0005593294 python3.9[186430]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc0032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:39 np0005593294 python3.9[186582]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:39.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:40.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:40 np0005593294 python3.9[186740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:40 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:40 np0005593294 python3.9[186912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:41 np0005593294 python3.9[187064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:41.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:42 np0005593294 python3.9[187217]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:42.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:42 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:42 np0005593294 python3.9[187369]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:43 np0005593294 python3.9[187522]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:43.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:44.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:44 np0005593294 podman[187646]: 2026-01-23 10:06:44.215900651 +0000 UTC m=+0.091543880 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 05:06:44 np0005593294 python3.9[187692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:44 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:45 np0005593294 python3.9[187852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:45 np0005593294 python3.9[188005]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:45.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:46.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:46 np0005593294 python3.9[188136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162805.293492-2305-269383149811741/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:46 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:46 np0005593294 python3.9[188362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:47 np0005593294 python3.9[188485]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162806.4529254-2305-211999890843799/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:47.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:48.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:48 np0005593294 python3.9[188638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:48 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:48 np0005593294 python3.9[188761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162807.6888402-2305-95273635115613/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:49 np0005593294 python3.9[188913]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100649 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:06:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:49 np0005593294 python3.9[189037]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162808.8653703-2305-36663104291640/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:06:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:49.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:06:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:50.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:50 np0005593294 python3.9[189189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:50 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:06:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:06:51 np0005593294 python3.9[189312]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162810.051862-2305-281331431970054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:51 np0005593294 python3.9[189465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:51.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:52.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100652 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:06:52 np0005593294 python3.9[189588]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162811.3475583-2305-227451832088885/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:52 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:52 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:53 np0005593294 python3.9[189742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:53 np0005593294 python3.9[189866]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162812.599231-2305-173728037810526/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:53.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:54.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:54 np0005593294 python3.9[190018]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:54 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:54 np0005593294 python3.9[190141]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162813.8691988-2305-172959392001028/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:06:55.032 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:06:55.033 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:06:55.033 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:55 np0005593294 python3.9[190294]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:55.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:56 np0005593294 python3.9[190417]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162815.1196125-2305-232542808964308/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:56.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:56 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:56 np0005593294 python3.9[190594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:57 np0005593294 python3.9[190717]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162816.33545-2305-12300031237612/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:57 np0005593294 podman[190842]: 2026-01-23 10:06:57.774632371 +0000 UTC m=+0.065502208 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 05:06:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:57.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:57 np0005593294 python3.9[190890]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:58.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:58 np0005593294 python3.9[191013]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162817.4661543-2305-20698561615825/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:58 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:59 np0005593294 python3.9[191165]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:06:59 np0005593294 python3.9[191289]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162818.7391837-2305-99990440193089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:06:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:59.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:00.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:00 np0005593294 python3.9[191466]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:00 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:01 np0005593294 python3.9[191589]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162819.9215534-2305-269813669787520/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:01 np0005593294 python3.9[191742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:01.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:02.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:02 np0005593294 python3.9[191865]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162821.2720916-2305-80890308419736/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:07:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:07:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:03 np0005593294 python3.9[192015]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:03.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:04 np0005593294 python3.9[192171]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 05:07:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:04.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:04 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:04 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:07:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:07:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:05.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:07:06 np0005593294 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 05:07:06 np0005593294 python3.9[192328]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:06.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:06 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:07 np0005593294 python3.9[192480]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:07:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:07.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:08 np0005593294 python3.9[192633]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:07:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:08.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:07:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:08 np0005593294 python3.9[192785]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:09 np0005593294 python3.9[192937]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:09.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:10.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:10 np0005593294 python3.9[193090]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:10 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:10 np0005593294 python3.9[193242]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:11 np0005593294 python3.9[193394]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100711 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:07:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:11.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:12 np0005593294 python3.9[193547]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:12.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:12 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:12 np0005593294 python3.9[193699]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.835200) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832835278, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 707, "num_deletes": 251, "total_data_size": 1525072, "memory_usage": 1546200, "flush_reason": "Manual Compaction"}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832843985, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 986082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17267, "largest_seqno": 17969, "table_properties": {"data_size": 982539, "index_size": 1387, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7958, "raw_average_key_size": 19, "raw_value_size": 975535, "raw_average_value_size": 2373, "num_data_blocks": 61, "num_entries": 411, "num_filter_entries": 411, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162786, "oldest_key_time": 1769162786, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 8850 microseconds, and 4469 cpu microseconds.
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.844060) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 986082 bytes OK
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.844086) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847798) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847827) EVENT_LOG_v1 {"time_micros": 1769162832847819, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847850) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1521240, prev total WAL file size 1521240, number of live WAL files 2.
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.848867) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(962KB)], [30(12MB)]
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832848920, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14160674, "oldest_snapshot_seqno": -1}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4877 keys, 11786079 bytes, temperature: kUnknown
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832929632, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11786079, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11752713, "index_size": 20072, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123348, "raw_average_key_size": 25, "raw_value_size": 11663198, "raw_average_value_size": 2391, "num_data_blocks": 835, "num_entries": 4877, "num_filter_entries": 4877, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.929872) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11786079 bytes
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.931549) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.3 rd, 145.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.6 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(26.3) write-amplify(12.0) OK, records in: 5392, records dropped: 515 output_compression: NoCompression
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.931567) EVENT_LOG_v1 {"time_micros": 1769162832931559, "job": 16, "event": "compaction_finished", "compaction_time_micros": 80778, "compaction_time_cpu_micros": 40950, "output_level": 6, "num_output_files": 1, "total_output_size": 11786079, "num_input_records": 5392, "num_output_records": 4877, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832931817, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832933971, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.848709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:13 np0005593294 python3.9[193851]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:13 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:13 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:13 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:13.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:14 np0005593294 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 05:07:14 np0005593294 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 05:07:14 np0005593294 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 05:07:14 np0005593294 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 05:07:14 np0005593294 systemd[1]: Starting libvirt logging daemon...
Jan 23 05:07:14 np0005593294 systemd[1]: Started libvirt logging daemon.
Jan 23 05:07:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:14.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:07:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:14 np0005593294 podman[194017]: 2026-01-23 10:07:14.704479712 +0000 UTC m=+0.100443515 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:07:14 np0005593294 python3.9[194067]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:14 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:15 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:15 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:15 np0005593294 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 05:07:15 np0005593294 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 05:07:15 np0005593294 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 05:07:15 np0005593294 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 05:07:15 np0005593294 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 05:07:15 np0005593294 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 05:07:15 np0005593294 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 05:07:15 np0005593294 systemd[1]: Started libvirt nodedev daemon.
Jan 23 05:07:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:15.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:16 np0005593294 python3.9[194289]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:16 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:16 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:16 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:16.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:16 np0005593294 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 05:07:16 np0005593294 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 05:07:16 np0005593294 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 05:07:16 np0005593294 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 05:07:16 np0005593294 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 05:07:16 np0005593294 systemd[1]: Starting libvirt proxy daemon...
Jan 23 05:07:16 np0005593294 systemd[1]: Started libvirt proxy daemon.
Jan 23 05:07:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:16 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:16 np0005593294 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 05:07:16 np0005593294 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 05:07:16 np0005593294 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 05:07:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:17 np0005593294 python3.9[194508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:17 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:17 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:17 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:17 np0005593294 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 05:07:17 np0005593294 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 05:07:17 np0005593294 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 05:07:17 np0005593294 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 05:07:17 np0005593294 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 05:07:17 np0005593294 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 05:07:17 np0005593294 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 05:07:17 np0005593294 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 05:07:17 np0005593294 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 05:07:17 np0005593294 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 05:07:17 np0005593294 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 05:07:17 np0005593294 systemd[1]: Started libvirt QEMU daemon.
Jan 23 05:07:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:17.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:17 np0005593294 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4de0b908-b857-4837-917a-7201a6fb06a8
Jan 23 05:07:17 np0005593294 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 23 05:07:17 np0005593294 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4de0b908-b857-4837-917a-7201a6fb06a8
Jan 23 05:07:17 np0005593294 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 23 05:07:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:18.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:18 np0005593294 python3.9[194727]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:18 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:18 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:18 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:18 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:18 np0005593294 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 05:07:18 np0005593294 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 05:07:18 np0005593294 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 05:07:18 np0005593294 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 05:07:18 np0005593294 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 05:07:18 np0005593294 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 05:07:18 np0005593294 systemd[1]: Starting libvirt secret daemon...
Jan 23 05:07:18 np0005593294 systemd[1]: Started libvirt secret daemon.
Jan 23 05:07:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:19 np0005593294 python3.9[194939]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:19.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:20.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:20 np0005593294 python3.9[195116]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:07:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:21 np0005593294 python3.9[195268]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:21 np0005593294 python3.9[195423]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:07:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:21.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:22.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:22 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:22 np0005593294 python3.9[195573]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:23 np0005593294 python3.9[195694]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162842.3655317-3379-244738027878309/.source.xml follow=False _original_basename=secret.xml.j2 checksum=19688f6e42a741164eafec41a84b8e73a76d185a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:23.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:24 np0005593294 python3.9[195849]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine f3005f84-239a-55b6-a948-8f1fb592b920#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:24.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:24 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:24 np0005593294 python3.9[196011]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:25.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:26 np0005593294 auditd[701]: Audit daemon rotating log files
Jan 23 05:07:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:26.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:26 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:27 np0005593294 python3.9[196475]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:27 np0005593294 python3.9[196628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:27.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:28 np0005593294 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 05:07:28 np0005593294 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.064s CPU time.
Jan 23 05:07:28 np0005593294 podman[196723]: 2026-01-23 10:07:28.068672406 +0000 UTC m=+0.061427187 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:07:28 np0005593294 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 05:07:28 np0005593294 python3.9[196770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162847.2236483-3544-213809944158388/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:28.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:28 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:29 np0005593294 python3.9[196922]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:29 np0005593294 python3.9[197075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:29.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593294 python3.9[197153]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:30.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:30 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:30 np0005593294 python3.9[197305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:31 np0005593294 python3.9[197383]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qm76jcti recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:31.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:31 np0005593294 python3.9[197536]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:32.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:32 np0005593294 python3.9[197614]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:32 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:33 np0005593294 python3.9[197766]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:33 np0005593294 python3[197920]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 05:07:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:07:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:33.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:07:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:34.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:34 np0005593294 python3.9[198072]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:34 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:35 np0005593294 python3.9[198150]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:35 np0005593294 python3.9[198303]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:35.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:36 np0005593294 python3.9[198428]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162855.2519152-3811-183755194409300/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:36.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:36 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a450 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:36 np0005593294 python3.9[198580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:37 np0005593294 python3.9[198658]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:37.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:38 np0005593294 python3.9[198811]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:38.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:38 np0005593294 python3.9[198889]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:38 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:39 np0005593294 python3.9[199041]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:39 np0005593294 python3.9[199167]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162858.6528866-3928-19840934855459/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:39.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:40.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:40 np0005593294 python3.9[199344]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:40 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:41 np0005593294 python3.9[199496]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:41.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:42 np0005593294 python3.9[199652]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:42.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:42 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:42 np0005593294 python3.9[199804]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:43 np0005593294 python3.9[199958]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:07:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:43.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:44.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:44 np0005593294 python3.9[200112]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:44 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:45 np0005593294 podman[200239]: 2026-01-23 10:07:45.252740273 +0000 UTC m=+0.145807207 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:07:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a4b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:45 np0005593294 python3.9[200286]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:07:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:45.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:07:46 np0005593294 python3.9[200446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:07:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:46.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:07:46 np0005593294 python3.9[200569]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162865.5692914-4144-100914400769334/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:46 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:47 np0005593294 python3.9[200721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a4d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:47 np0005593294 python3.9[200845]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162866.7798343-4189-11565546158488/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:47.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:48.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593294 python3.9[200997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:48 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:49 np0005593294 python3.9[201120]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162868.0540428-4234-180452146909196/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:49 np0005593294 python3.9[201273]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:07:49 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:49.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:50 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:50 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:50.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:50 np0005593294 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 05:07:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:50 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a4f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:51 np0005593294 python3.9[201463]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 05:07:51 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:51 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:51 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:51 np0005593294 systemd[1]: Reloading.
Jan 23 05:07:51 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:51 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:51.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:52.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:52 np0005593294 systemd[1]: session-52.scope: Deactivated successfully.
Jan 23 05:07:52 np0005593294 systemd[1]: session-52.scope: Consumed 3min 39.595s CPU time.
Jan 23 05:07:52 np0005593294 systemd-logind[807]: Session 52 logged out. Waiting for processes to exit.
Jan 23 05:07:52 np0005593294 systemd-logind[807]: Removed session 52.
Jan 23 05:07:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:52 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:53.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:07:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:54.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:07:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:54 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:07:55.034 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:07:55.034 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:07:55.035 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100755 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:07:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:55.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:56.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:56 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:07:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:07:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:07:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:07:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:57 np0005593294 systemd-logind[807]: New session 53 of user zuul.
Jan 23 05:07:57 np0005593294 systemd[1]: Started Session 53 of User zuul.
Jan 23 05:07:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:57.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:07:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:58.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:58 np0005593294 podman[201774]: 2026-01-23 10:07:58.66200035 +0000 UTC m=+0.061149178 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:07:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:58 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:58 np0005593294 python3.9[201810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:07:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:59.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:00 np0005593294 python3.9[201974]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:08:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:00.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:00 np0005593294 network[202016]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:08:00 np0005593294 network[202017]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:08:00 np0005593294 network[202018]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:08:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:00 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:08:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:02.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:08:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:03 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:08:03 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:08:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:04.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:04.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:04 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:05 np0005593294 python3.9[202317]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:08:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:08:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:06.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:06 np0005593294 python3.9[202402]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:08:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:06.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:06 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:08.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:08.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:08:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:08:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:10.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:10.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:10 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:12.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:12.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:12 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:08:12 np0005593294 python3.9[202558]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:12 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:13 np0005593294 python3.9[202711]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:14.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:14.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:14 np0005593294 python3.9[202864]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:15 np0005593294 python3.9[203016]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:15 np0005593294 podman[203142]: 2026-01-23 10:08:15.6966666 +0000 UTC m=+0.102312639 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:08:15 np0005593294 python3.9[203189]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:16.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:16.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:16 np0005593294 python3.9[203320]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162895.3735876-241-121184879443453/.source.iscsi _original_basename=.qq4qc0ky follow=False checksum=a41d40f9dbaa7a1982953c824d01a61d8b3c4d3a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:16 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:17 np0005593294 python3.9[203472]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100817 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:08:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:18.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:18 np0005593294 python3.9[203625]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:18.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:18 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:19 np0005593294 python3.9[203777]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:19 np0005593294 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 05:08:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:20.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:20 np0005593294 python3.9[203934]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:20 np0005593294 systemd[1]: Reloading.
Jan 23 05:08:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:20.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:20 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:20 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:20 np0005593294 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 05:08:20 np0005593294 systemd[1]: Starting Open-iSCSI...
Jan 23 05:08:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:20 np0005593294 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 05:08:20 np0005593294 systemd[1]: Started Open-iSCSI.
Jan 23 05:08:20 np0005593294 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 05:08:20 np0005593294 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 05:08:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:22.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:22 np0005593294 python3.9[204160]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:08:22 np0005593294 network[204177]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:08:22 np0005593294 network[204178]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:08:22 np0005593294 network[204179]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:08:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:22.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:22 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:24.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:24.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:24 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:26.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:26.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:26 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:26 np0005593294 python3.9[204454]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:08:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:08:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:28.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:08:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:28 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:28 np0005593294 podman[204461]: 2026-01-23 10:08:28.834334576 +0000 UTC m=+0.062322125 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:08:29 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 05:08:29 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 05:08:29 np0005593294 systemd[1]: Reloading.
Jan 23 05:08:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:29 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:29 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:29 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 05:08:29 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 05:08:29 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 05:08:29 np0005593294 systemd[1]: run-rf5d332351af540ce86f4f3b8d1944ec1.service: Deactivated successfully.
Jan 23 05:08:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:30.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:30.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:30 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:31 np0005593294 python3.9[204794]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 05:08:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:32.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:32 np0005593294 python3.9[204946]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 05:08:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:32 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:33 np0005593294 python3.9[205102]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:33 np0005593294 python3.9[205226]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162912.5913036-505-230516159021048/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:34.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:34 np0005593294 python3.9[205378]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:34.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:34 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180030e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:35 np0005593294 python3.9[205530]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:08:35 np0005593294 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 05:08:35 np0005593294 systemd[1]: Stopped Load Kernel Modules.
Jan 23 05:08:35 np0005593294 systemd[1]: Stopping Load Kernel Modules...
Jan 23 05:08:35 np0005593294 systemd[1]: Starting Load Kernel Modules...
Jan 23 05:08:35 np0005593294 systemd[1]: Finished Load Kernel Modules.
Jan 23 05:08:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:36.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:08:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:08:36 np0005593294 python3.9[205687]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:36 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:37 np0005593294 python3.9[205840]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180030e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 05:08:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:38.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 05:08:38 np0005593294 python3.9[205993]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:08:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:38.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:08:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:38 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:38 np0005593294 python3.9[206116]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162917.7334728-658-89372904624181/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:39 np0005593294 python3.9[206268]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:40.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:40 np0005593294 python3.9[206422]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:40.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:40 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180030e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:41 np0005593294 python3.9[206574]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:41 np0005593294 python3.9[206752]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:42.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:42.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:42 np0005593294 python3.9[206904]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:42 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:43 np0005593294 python3.9[207057]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:43 np0005593294 python3.9[207210]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:44.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:44.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:44 np0005593294 python3.9[207362]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:44 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:45 np0005593294 python3.9[207514]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:45 np0005593294 podman[207669]: 2026-01-23 10:08:45.854567381 +0000 UTC m=+0.099261681 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 05:08:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:45 np0005593294 python3.9[207670]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:46.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:46.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:46 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:46 np0005593294 python3.9[207848]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:46 np0005593294 systemd[1]: Listening on multipathd control socket.
Jan 23 05:08:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:47 np0005593294 python3.9[208005]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:47 np0005593294 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 05:08:47 np0005593294 udevadm[208010]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 05:08:47 np0005593294 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 05:08:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:47 np0005593294 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 05:08:47 np0005593294 multipathd[208013]: --------start up--------
Jan 23 05:08:47 np0005593294 multipathd[208013]: read /etc/multipath.conf
Jan 23 05:08:47 np0005593294 multipathd[208013]: path checkers start up
Jan 23 05:08:47 np0005593294 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 05:08:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:48.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:48.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:48 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:48 np0005593294 python3.9[208172]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 05:08:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:49 np0005593294 python3.9[208325]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 05:08:49 np0005593294 kernel: Key type psk registered
Jan 23 05:08:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:50.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:50 np0005593294 python3.9[208489]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:50.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:50 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:50 np0005593294 python3.9[208612]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162929.9559839-1048-163712510742347/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:51 np0005593294 python3.9[208765]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:52.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:52.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:52 np0005593294 python3.9[208917]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:08:52 np0005593294 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 05:08:52 np0005593294 systemd[1]: Stopped Load Kernel Modules.
Jan 23 05:08:52 np0005593294 systemd[1]: Stopping Load Kernel Modules...
Jan 23 05:08:52 np0005593294 systemd[1]: Starting Load Kernel Modules...
Jan 23 05:08:52 np0005593294 systemd[1]: Finished Load Kernel Modules.
Jan 23 05:08:52 np0005593294 kernel: ganesha.nfsd[195820]: segfault at 50 ip 00007f8a9d96c32e sp 00007f8a097f9210 error 4 in libntirpc.so.5.8[7f8a9d951000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 05:08:52 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:08:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:52 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy ignored for local
Jan 23 05:08:52 np0005593294 systemd[1]: Started Process Core Dump (PID 208946/UID 0).
Jan 23 05:08:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:53 np0005593294 python3.9[209075]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:08:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:54.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:54 np0005593294 systemd-coredump[208947]: Process 180681 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007f8a9d96c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f8a9d976900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 05:08:54 np0005593294 systemd[1]: systemd-coredump@7-208946-0.service: Deactivated successfully.
Jan 23 05:08:54 np0005593294 systemd[1]: systemd-coredump@7-208946-0.service: Consumed 1.349s CPU time.
Jan 23 05:08:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:54.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:54 np0005593294 podman[209082]: 2026-01-23 10:08:54.469528798 +0000 UTC m=+0.053568157 container died 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 23 05:08:54 np0005593294 systemd[1]: var-lib-containers-storage-overlay-c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd-merged.mount: Deactivated successfully.
Jan 23 05:08:55 np0005593294 podman[209082]: 2026-01-23 10:08:55.002994911 +0000 UTC m=+0.587034250 container remove 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 23 05:08:55 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:08:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:08:55.035 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:08:55.036 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:08:55.036 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:55 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 05:08:55 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.015s CPU time.
Jan 23 05:08:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:56.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:56.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:56 np0005593294 systemd[1]: Reloading.
Jan 23 05:08:56 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:56 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:57 np0005593294 systemd[1]: Reloading.
Jan 23 05:08:57 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:57 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:57 np0005593294 systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 05:08:57 np0005593294 systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 05:08:57 np0005593294 lvm[209241]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 05:08:57 np0005593294 lvm[209241]: VG ceph_vg0 finished
Jan 23 05:08:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:58 np0005593294 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 05:08:58 np0005593294 systemd[1]: Starting man-db-cache-update.service...
Jan 23 05:08:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:58 np0005593294 systemd[1]: Reloading.
Jan 23 05:08:58 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:58 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:08:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:08:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:58.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:08:58 np0005593294 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 05:08:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100858 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:08:59 np0005593294 podman[209303]: 2026-01-23 10:08:59.663486819 +0000 UTC m=+0.059137374 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:09:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 05:09:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:00.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 05:09:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:00.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:01 np0005593294 python3.9[210639]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:09:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:02.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:02.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:02 np0005593294 systemd[1]: Stopping Open-iSCSI...
Jan 23 05:09:02 np0005593294 iscsid[204001]: iscsid shutting down.
Jan 23 05:09:02 np0005593294 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 05:09:02 np0005593294 systemd[1]: Stopped Open-iSCSI.
Jan 23 05:09:02 np0005593294 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 05:09:02 np0005593294 systemd[1]: Starting Open-iSCSI...
Jan 23 05:09:02 np0005593294 systemd[1]: Started Open-iSCSI.
Jan 23 05:09:02 np0005593294 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 05:09:02 np0005593294 systemd[1]: Finished man-db-cache-update.service.
Jan 23 05:09:02 np0005593294 systemd[1]: man-db-cache-update.service: Consumed 1.626s CPU time.
Jan 23 05:09:02 np0005593294 systemd[1]: run-rf772bd38b33d4e6f8d324c902ddfade4.service: Deactivated successfully.
Jan 23 05:09:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:03 np0005593294 python3.9[210863]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:09:03 np0005593294 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 05:09:03 np0005593294 multipathd[208013]: exit (signal)
Jan 23 05:09:03 np0005593294 multipathd[208013]: --------shut down-------
Jan 23 05:09:03 np0005593294 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 05:09:03 np0005593294 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 05:09:03 np0005593294 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 05:09:03 np0005593294 multipathd[210884]: --------start up--------
Jan 23 05:09:03 np0005593294 multipathd[210884]: read /etc/multipath.conf
Jan 23 05:09:03 np0005593294 multipathd[210884]: path checkers start up
Jan 23 05:09:03 np0005593294 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 05:09:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:04.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:04.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:04 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:04 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:05 np0005593294 python3.9[211041]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:09:05 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 8.
Jan 23 05:09:05 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:09:05 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.015s CPU time.
Jan 23 05:09:05 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:09:05 np0005593294 podman[211094]: 2026-01-23 10:09:05.602935554 +0000 UTC m=+0.054142005 container create f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 05:09:05 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:09:05 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:09:05 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:09:05 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:09:05 np0005593294 podman[211094]: 2026-01-23 10:09:05.580583403 +0000 UTC m=+0.031789874 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:09:05 np0005593294 podman[211094]: 2026-01-23 10:09:05.682040703 +0000 UTC m=+0.133247174 container init f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:09:05 np0005593294 podman[211094]: 2026-01-23 10:09:05.688645344 +0000 UTC m=+0.139851795 container start f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:09:05 np0005593294 bash[211094]: f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:09:05 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:09:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.929152) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945929625, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1314, "num_deletes": 260, "total_data_size": 3217789, "memory_usage": 3265072, "flush_reason": "Manual Compaction"}
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945945963, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2098879, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17974, "largest_seqno": 19283, "table_properties": {"data_size": 2093326, "index_size": 2947, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11386, "raw_average_key_size": 18, "raw_value_size": 2082056, "raw_average_value_size": 3402, "num_data_blocks": 132, "num_entries": 612, "num_filter_entries": 612, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162834, "oldest_key_time": 1769162834, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 16575 microseconds, and 7901 cpu microseconds.
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.946068) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2098879 bytes OK
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.946107) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.948727) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.948764) EVENT_LOG_v1 {"time_micros": 1769162945948759, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.948786) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3211535, prev total WAL file size 3211535, number of live WAL files 2.
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.950157) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323536' seq:0, type:0; will stop at (end)
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2049KB)], [33(11MB)]
Jan 23 05:09:05 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945950245, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13884958, "oldest_snapshot_seqno": -1}
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4955 keys, 13427756 bytes, temperature: kUnknown
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946050859, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13427756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13392837, "index_size": 21433, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126107, "raw_average_key_size": 25, "raw_value_size": 13300862, "raw_average_value_size": 2684, "num_data_blocks": 881, "num_entries": 4955, "num_filter_entries": 4955, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.051572) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13427756 bytes
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.053895) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 137.7 rd, 133.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.2 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(13.0) write-amplify(6.4) OK, records in: 5489, records dropped: 534 output_compression: NoCompression
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.053932) EVENT_LOG_v1 {"time_micros": 1769162946053918, "job": 18, "event": "compaction_finished", "compaction_time_micros": 100857, "compaction_time_cpu_micros": 30813, "output_level": 6, "num_output_files": 1, "total_output_size": 13427756, "num_input_records": 5489, "num_output_records": 4955, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946054801, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946058389, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.950023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058587) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:06.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:06.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:06 np0005593294 python3.9[211303]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:08.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:08 np0005593294 python3.9[211456]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:09:08 np0005593294 systemd[1]: Reloading.
Jan 23 05:09:08 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:09:08 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:09:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:08.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:09 np0005593294 python3.9[211641]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:09:09 np0005593294 network[211658]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:09:09 np0005593294 network[211659]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:09:09 np0005593294 network[211660]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:09:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:10.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:10.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:11 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:11 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:09:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:09:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:12.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:12.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:14 np0005593294 python3.9[211961]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:14.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:14.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:14 np0005593294 python3.9[212114]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:15 np0005593294 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 05:09:15 np0005593294 python3.9[212267]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:15 np0005593294 podman[212394]: 2026-01-23 10:09:15.99444116 +0000 UTC m=+0.086678431 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:09:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:16.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:16 np0005593294 python3.9[212441]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:16.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:16 np0005593294 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 05:09:17 np0005593294 python3.9[212602]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:09:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:17 np0005593294 python3.9[212756]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:18.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:18.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:18 np0005593294 python3.9[212924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:19 np0005593294 python3.9[213077]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:20.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:20 np0005593294 python3.9[213231]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:20.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100920 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:09:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:21 np0005593294 python3.9[213383]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:21 np0005593294 python3.9[213561]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:22.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:22 np0005593294 python3.9[213713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:22.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:22 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:22 np0005593294 python3.9[213865]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:23 np0005593294 python3.9[214017]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:24.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:24 np0005593294 python3.9[214170]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:24.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:24 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:24 np0005593294 python3.9[214322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:25 np0005593294 python3.9[214474]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:26 np0005593294 python3.9[214627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:26.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:26 np0005593294 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 05:09:26 np0005593294 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 05:09:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:26.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:26 np0005593294 python3.9[214781]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:26 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:27 np0005593294 python3.9[214933]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:27 np0005593294 python3.9[215086]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:28.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:28 np0005593294 python3.9[215238]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:28.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:28 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:29 np0005593294 python3.9[215390]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:29 np0005593294 python3.9[215543]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:30.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:30.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:30 np0005593294 podman[215667]: 2026-01-23 10:09:30.513759434 +0000 UTC m=+0.063630556 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:09:30 np0005593294 python3.9[215708]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:30 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:31 np0005593294 python3.9[215867]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:09:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:32.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:32 np0005593294 python3.9[216020]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:09:32 np0005593294 systemd[1]: Reloading.
Jan 23 05:09:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:32.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:32 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:09:32 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:09:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:32 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:33 np0005593294 python3.9[216207]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:34 np0005593294 python3.9[216361]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:34.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:34.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:34 np0005593294 python3.9[216514]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:34 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:35 np0005593294 python3.9[216667]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:35 np0005593294 python3.9[216821]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:36.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:36 np0005593294 python3.9[216974]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:36.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:36 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:37 np0005593294 python3.9[217127]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:37 np0005593294 python3.9[217281]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:38.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:38.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:38 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:40.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:40 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:41 np0005593294 python3.9[217461]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:42.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:42 np0005593294 python3.9[217613]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:42.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:42 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:43 np0005593294 python3.9[217765]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:44.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:44 np0005593294 python3.9[217918]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:44.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:44 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:44 np0005593294 python3.9[218070]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:45 np0005593294 python3.9[218223]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:46.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:46 np0005593294 podman[218347]: 2026-01-23 10:09:46.281719143 +0000 UTC m=+0.104603962 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:09:46 np0005593294 python3.9[218390]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:46.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:46 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:47 np0005593294 python3.9[218551]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:47 np0005593294 python3.9[218704]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:48.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:48 np0005593294 python3.9[218856]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:48.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:48 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:50.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:50.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:50 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:09:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1204.3 total, 600.0 interval#012Cumulative writes: 9060 writes, 35K keys, 9060 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 9060 writes, 1959 syncs, 4.62 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 781 writes, 1248 keys, 781 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s#012Interval WAL: 781 writes, 366 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1204.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1204.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1204.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 23 05:09:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:52.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:52.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:52 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:54.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:54.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:54 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:54 np0005593294 python3.9[219013]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 05:09:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:09:55.036 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:09:55.037 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:09:55.037 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:55 np0005593294 python3.9[219167]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 05:09:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:56.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:56.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:57 np0005593294 python3.9[219325]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 05:09:57 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:09:57 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:09:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:58.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:58 np0005593294 systemd-logind[807]: New session 54 of user zuul.
Jan 23 05:09:58 np0005593294 systemd[1]: Started Session 54 of User zuul.
Jan 23 05:09:58 np0005593294 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 05:09:58 np0005593294 systemd-logind[807]: Session 54 logged out. Waiting for processes to exit.
Jan 23 05:09:58 np0005593294 systemd-logind[807]: Removed session 54.
Jan 23 05:09:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:09:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:09:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:58.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:09:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:58 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:59 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:59 np0005593294 python3.9[219513]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:09:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:59 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:00 np0005593294 python3.9[219635]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162998.9841406-2655-224926547727298/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:00.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:00 np0005593294 ceph-mon[80126]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:10:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:00.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:00 np0005593294 podman[219759]: 2026-01-23 10:10:00.669451658 +0000 UTC m=+0.057738510 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:10:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:00 np0005593294 python3.9[219796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:01 np0005593294 python3.9[219878]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:01 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:01 np0005593294 python3.9[220054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:01 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:02.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:02 np0005593294 python3.9[220175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163001.4491875-2655-276195173207180/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:02.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:02 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:03 np0005593294 python3.9[220325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:03 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:03 np0005593294 python3.9[220447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163002.58453-2655-13136816431452/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:03 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:10:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:04.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:10:04 np0005593294 python3.9[220597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:04.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:04 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:05 np0005593294 python3.9[220718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163004.04586-2655-226613656211426/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:05 np0005593294 python3.9[220869]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:06.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:06 np0005593294 python3.9[220990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163005.4865828-2655-171246921296239/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:06.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:06 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:07 np0005593294 python3.9[221142]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:07 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:07 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:07 np0005593294 python3.9[221295]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:08.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:08.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:08 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:08 np0005593294 python3.9[221447]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:09 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:09 np0005593294 python3.9[221599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:09 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:10 np0005593294 python3.9[221723]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769163009.0390093-2977-42477984145074/.source _original_basename=.1srvojmh follow=False checksum=43aa8ea3ed4ec99d1d20bccd165c6d046c0b601f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 05:10:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:10.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:10 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:10 np0005593294 python3.9[221875]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:11 np0005593294 python3.9[222061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:12 np0005593294 python3.9[222270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163011.1739957-3054-64684880345036/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:12.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:12.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:12 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:10:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:10:13 np0005593294 python3.9[222451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:13 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:13 np0005593294 python3.9[222572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163012.319673-3099-110490860432733/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:13 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:14.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:14.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:14 np0005593294 python3.9[222725]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 05:10:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:14 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:15 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:15 np0005593294 python3.9[222878]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:10:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:15 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:16.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:16.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:16 np0005593294 podman[222997]: 2026-01-23 10:10:16.715716777 +0000 UTC m=+0.113123783 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:10:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:16 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:17 np0005593294 python3[223054]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:10:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:18 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:18 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:18.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:18.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:20.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:20.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:22.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:22.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:22 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:24.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:24.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:24 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:26.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:26.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:26 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:28 np0005593294 podman[223071]: 2026-01-23 10:10:28.120728152 +0000 UTC m=+11.044589546 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 05:10:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:28.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:28 np0005593294 podman[223218]: 2026-01-23 10:10:28.253014872 +0000 UTC m=+0.024693825 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 05:10:28 np0005593294 podman[223218]: 2026-01-23 10:10:28.395198245 +0000 UTC m=+0.166877108 container create cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 05:10:28 np0005593294 python3[223054]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 05:10:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:28.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:28 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:30.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:30 np0005593294 python3.9[223410]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:30.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:30 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:31 np0005593294 podman[223536]: 2026-01-23 10:10:31.465560369 +0000 UTC m=+0.058228109 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:10:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:31 np0005593294 python3.9[223582]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 05:10:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:32.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:32 np0005593294 python3.9[223735]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:10:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:32.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:32 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:33 np0005593294 python3[223887]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:10:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:33 np0005593294 podman[223923]: 2026-01-23 10:10:33.634669604 +0000 UTC m=+0.051720912 container create d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 05:10:33 np0005593294 podman[223923]: 2026-01-23 10:10:33.605599342 +0000 UTC m=+0.022650670 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 05:10:33 np0005593294 python3[223887]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 23 05:10:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:34.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:34.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:34 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:36.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:36.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:36 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:38.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:38.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:38 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:40.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:40.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:40 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:41 np0005593294 python3.9[224114]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:41 np0005593294 python3.9[224294]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:42.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:42 np0005593294 python3.9[224445]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769163041.9171777-3387-71067633594809/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:42.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:42 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:43 np0005593294 python3.9[224521]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:10:43 np0005593294 systemd[1]: Reloading.
Jan 23 05:10:43 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:10:43 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:10:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:44 np0005593294 python3.9[224632]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:10:44 np0005593294 systemd[1]: Reloading.
Jan 23 05:10:44 np0005593294 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:10:44 np0005593294 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:10:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:44.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:44 np0005593294 systemd[1]: Starting nova_compute container...
Jan 23 05:10:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:44.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:44 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:10:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:44 np0005593294 podman[224672]: 2026-01-23 10:10:44.716180301 +0000 UTC m=+0.099495599 container init d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:10:44 np0005593294 podman[224672]: 2026-01-23 10:10:44.721598513 +0000 UTC m=+0.104913781 container start d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:10:44 np0005593294 podman[224672]: nova_compute
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + sudo -E kolla_set_configs
Jan 23 05:10:44 np0005593294 systemd[1]: Started nova_compute container.
Jan 23 05:10:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:44 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Validating config file
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying service configuration files
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Deleting /etc/ceph
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Creating directory /etc/ceph
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Writing out command to execute
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:44 np0005593294 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:44 np0005593294 nova_compute[224687]: ++ cat /run_command
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + CMD=nova-compute
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + ARGS=
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + sudo kolla_copy_cacerts
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + [[ ! -n '' ]]
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + . kolla_extend_start
Jan 23 05:10:44 np0005593294 nova_compute[224687]: Running command: 'nova-compute'
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + umask 0022
Jan 23 05:10:44 np0005593294 nova_compute[224687]: + exec nova-compute
Jan 23 05:10:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101045 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:10:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:46.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:46.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:46 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.239 224691 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.275 224691 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.276 224691 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:10:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:10:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3512 writes, 20K keys, 3512 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.04 MB/s#012Cumulative WAL: 3512 writes, 3512 syncs, 1.00 writes per sync, written: 0.05 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1356 writes, 6394 keys, 1356 commit groups, 1.0 writes per commit group, ingest: 16.20 MB, 0.03 MB/s#012Interval WAL: 1356 writes, 1356 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     47.1      0.59              0.08         9    0.065       0      0       0.0       0.0#012  L6      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    135.0    117.2      0.84              0.29         8    0.105     39K   4175       0.0       0.0#012 Sum      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     79.2     88.2      1.43              0.37        17    0.084     39K   4175       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.1    127.6    128.2      0.34              0.13         6    0.057     16K   1877       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    135.0    117.2      0.84              0.29         8    0.105     39K   4175       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     47.2      0.59              0.08         8    0.073       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.027, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.12 GB write, 0.10 MB/s write, 0.11 GB read, 0.09 MB/s read, 1.4 seconds#012Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 4.88 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000142 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(262,4.55 MB,1.4973%) FilterBlock(17,118.48 KB,0.0380616%) IndexBlock(17,221.48 KB,0.0711491%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:10:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:47 np0005593294 podman[224829]: 2026-01-23 10:10:47.60490502 +0000 UTC m=+0.092729145 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:10:47 np0005593294 python3.9[224866]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.802 224691 INFO nova.virt.driver [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 05:10:47 np0005593294 nova_compute[224687]: 2026-01-23 10:10:47.933 224691 INFO nova.compute.provider_config [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 05:10:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.064 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 WARNING oslo_config.cfg [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 05:10:48 np0005593294 nova_compute[224687]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 05:10:48 np0005593294 nova_compute[224687]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 05:10:48 np0005593294 nova_compute[224687]: and ``live_migration_inbound_addr`` respectively.
Jan 23 05:10:48 np0005593294 nova_compute[224687]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.203 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.203 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.203 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.224 224691 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.292 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.292 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.293 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.293 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 05:10:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:48.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:48 np0005593294 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 05:10:48 np0005593294 systemd[1]: Started libvirt QEMU daemon.
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.375 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f0b935f8880> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.380 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f0b935f8880> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.382 224691 INFO nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Connection event '1' reason 'None'
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.512 224691 WARNING nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 23 05:10:48 np0005593294 nova_compute[224687]: 2026-01-23 10:10:48.512 224691 DEBUG nova.virt.libvirt.volume.mount [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 05:10:48 np0005593294 python3.9[225074]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:10:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:48.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:10:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:48 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.243 224691 INFO nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <host>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <uuid>53821a39-1f4a-4bf2-b036-ba3044ea8780</uuid>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <arch>x86_64</arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model>EPYC-Rome-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <vendor>AMD</vendor>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <microcode version='16777317'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <signature family='23' model='49' stepping='0'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='x2apic'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='tsc-deadline'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='osxsave'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='hypervisor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='tsc_adjust'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='spec-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='stibp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='arch-capabilities'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='cmp_legacy'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='topoext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='virt-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='lbrv'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='tsc-scale'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='vmcb-clean'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='pause-filter'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='pfthreshold'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='svme-addr-chk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='rdctl-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='mds-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature name='pschange-mc-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <pages unit='KiB' size='4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <pages unit='KiB' size='2048'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <pages unit='KiB' size='1048576'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <power_management>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <suspend_mem/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </power_management>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <iommu support='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <migration_features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <live/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <uri_transports>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <uri_transport>tcp</uri_transport>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <uri_transport>rdma</uri_transport>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </uri_transports>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </migration_features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <topology>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <cells num='1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <cell id='0'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          <memory unit='KiB'>7864316</memory>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          <pages unit='KiB' size='4'>1966079</pages>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          <distances>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <sibling id='0' value='10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          </distances>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          <cpus num='8'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:          </cpus>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        </cell>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </cells>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </topology>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <cache>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </cache>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <secmodel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model>selinux</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <doi>0</doi>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </secmodel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <secmodel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model>dac</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <doi>0</doi>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </secmodel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </host>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <guest>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <os_type>hvm</os_type>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <arch name='i686'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <wordsize>32</wordsize>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <domain type='qemu'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <domain type='kvm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <pae/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <nonpae/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <apic default='on' toggle='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <cpuselection/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <deviceboot/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <externalSnapshot/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </guest>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <guest>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <os_type>hvm</os_type>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <arch name='x86_64'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <wordsize>64</wordsize>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <domain type='qemu'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <domain type='kvm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <apic default='on' toggle='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <cpuselection/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <deviceboot/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <externalSnapshot/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </guest>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 
Jan 23 05:10:49 np0005593294 nova_compute[224687]: </capabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.249 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.270 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 05:10:49 np0005593294 nova_compute[224687]: <domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <domain>kvm</domain>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <arch>i686</arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <vcpu max='4096'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <iothreads supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <os supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='firmware'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <loader supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>rom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pflash</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='readonly'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>yes</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='secure'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </loader>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </os>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='maximumMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <vendor>AMD</vendor>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='succor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='custom' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <memoryBacking supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='sourceType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>anonymous</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>memfd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </memoryBacking>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <disk supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='diskDevice'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>disk</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cdrom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>floppy</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>lun</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>fdc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>sata</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </disk>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <graphics supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vnc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egl-headless</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </graphics>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <video supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='modelType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vga</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cirrus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>none</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>bochs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ramfb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </video>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hostdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='mode'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>subsystem</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='startupPolicy'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>mandatory</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>requisite</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>optional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='subsysType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pci</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='capsType'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='pciBackend'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hostdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <rng supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>random</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </rng>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <filesystem supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='driverType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>path</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>handle</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtiofs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </filesystem>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tpm supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-tis</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-crb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emulator</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>external</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendVersion'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>2.0</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </tpm>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <redirdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </redirdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <channel supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </channel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <crypto supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </crypto>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <interface supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>passt</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </interface>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <panic supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>isa</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>hyperv</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </panic>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <console supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>null</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dev</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pipe</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stdio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>udp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tcp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu-vdagent</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </console>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <gic supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <genid supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backup supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <async-teardown supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <s390-pv supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <ps2 supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tdx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sev supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sgx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hyperv supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='features'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>relaxed</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vapic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>spinlocks</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vpindex</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>runtime</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>synic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stimer</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reset</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vendor_id</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>frequencies</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reenlightenment</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tlbflush</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ipi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>avic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emsr_bitmap</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>xmm_input</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hyperv>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <launchSecurity supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: </domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.282 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 05:10:49 np0005593294 nova_compute[224687]: <domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <domain>kvm</domain>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <arch>i686</arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <vcpu max='240'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <iothreads supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <os supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='firmware'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <loader supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>rom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pflash</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='readonly'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>yes</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='secure'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </loader>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </os>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='maximumMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <vendor>AMD</vendor>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='succor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='custom' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <memoryBacking supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='sourceType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>anonymous</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>memfd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </memoryBacking>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <disk supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='diskDevice'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>disk</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cdrom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>floppy</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>lun</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ide</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>fdc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>sata</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </disk>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <graphics supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vnc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egl-headless</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </graphics>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <video supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='modelType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vga</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cirrus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>none</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>bochs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ramfb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </video>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hostdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='mode'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>subsystem</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='startupPolicy'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>mandatory</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>requisite</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>optional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='subsysType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pci</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='capsType'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='pciBackend'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hostdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <rng supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>random</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </rng>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <filesystem supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='driverType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>path</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>handle</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtiofs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </filesystem>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tpm supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-tis</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-crb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emulator</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>external</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendVersion'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>2.0</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </tpm>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <redirdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </redirdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <channel supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </channel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <crypto supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </crypto>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <interface supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>passt</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </interface>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <panic supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>isa</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>hyperv</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </panic>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <console supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>null</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dev</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pipe</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stdio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>udp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tcp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu-vdagent</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </console>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <gic supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <genid supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backup supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <async-teardown supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <s390-pv supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <ps2 supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tdx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sev supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sgx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hyperv supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='features'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>relaxed</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vapic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>spinlocks</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vpindex</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>runtime</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>synic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stimer</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reset</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vendor_id</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>frequencies</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reenlightenment</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tlbflush</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ipi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>avic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emsr_bitmap</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>xmm_input</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hyperv>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <launchSecurity supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: </domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.329 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.334 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 05:10:49 np0005593294 nova_compute[224687]: <domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <domain>kvm</domain>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <arch>x86_64</arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <vcpu max='240'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <iothreads supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <os supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='firmware'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <loader supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>rom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pflash</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='readonly'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>yes</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='secure'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </loader>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </os>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='maximumMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <vendor>AMD</vendor>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='succor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='custom' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <memoryBacking supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='sourceType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>anonymous</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>memfd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </memoryBacking>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <disk supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='diskDevice'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>disk</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cdrom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>floppy</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>lun</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ide</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>fdc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>sata</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </disk>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <graphics supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vnc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egl-headless</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </graphics>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <video supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='modelType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vga</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cirrus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>none</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>bochs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ramfb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </video>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hostdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='mode'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>subsystem</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='startupPolicy'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>mandatory</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>requisite</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>optional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='subsysType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pci</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='capsType'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='pciBackend'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hostdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <rng supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>random</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </rng>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <filesystem supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='driverType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>path</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>handle</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtiofs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </filesystem>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tpm supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-tis</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-crb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emulator</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>external</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendVersion'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>2.0</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </tpm>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <redirdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </redirdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <channel supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </channel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <crypto supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </crypto>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <interface supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>passt</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </interface>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <panic supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>isa</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>hyperv</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </panic>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <console supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>null</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dev</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pipe</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stdio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>udp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tcp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu-vdagent</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </console>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <gic supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <genid supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backup supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <async-teardown supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <s390-pv supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <ps2 supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tdx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sev supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sgx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hyperv supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='features'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>relaxed</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vapic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>spinlocks</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vpindex</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>runtime</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>synic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stimer</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reset</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vendor_id</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>frequencies</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reenlightenment</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tlbflush</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ipi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>avic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emsr_bitmap</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>xmm_input</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hyperv>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <launchSecurity supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: </domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.405 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 05:10:49 np0005593294 nova_compute[224687]: <domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <domain>kvm</domain>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <arch>x86_64</arch>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <vcpu max='4096'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <iothreads supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <os supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='firmware'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>efi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <loader supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>rom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pflash</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='readonly'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>yes</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='secure'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>yes</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>no</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </loader>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </os>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='maximumMigratable'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>on</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>off</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <vendor>AMD</vendor>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='succor'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <mode name='custom' supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ddpd-u'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sha512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm3'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sm4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Denverton-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amd-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='auto-ibrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='perfmon-v2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbpb'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='stibp-always-on'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='EPYC-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-128'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-256'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx10-512'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='prefetchiti'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v1'>
Jan 23 05:10:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Haswell-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512er'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512pf'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fma4'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tbm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xop'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='amx-tile'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-bf16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-fp16'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bitalg'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrc'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fzrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='la57'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='taa-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ifma'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cmpccxadd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fbsdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='fsrs'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ibrs-all'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='intel-psfd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='lam'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mcdt-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pbrsb-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='psdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='serialize'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vaes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='hle'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='rtm'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512bw'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512cd'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512dq'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512f'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='avx512vl'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='invpcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pcid'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='pku'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='mpx'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='core-capability'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='split-lock-detect'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='cldemote'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='erms'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='gfni'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdir64b'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='movdiri'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='xsaves'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='athlon-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='core2duo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='coreduo-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='n270-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='ss'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <blockers model='phenom-v1'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnow'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <feature name='3dnowext'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </blockers>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </mode>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </cpu>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <memoryBacking supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <enum name='sourceType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>anonymous</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <value>memfd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </memoryBacking>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <disk supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='diskDevice'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>disk</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cdrom</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>floppy</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>lun</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>fdc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>sata</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </disk>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <graphics supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vnc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egl-headless</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </graphics>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <video supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='modelType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vga</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>cirrus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>none</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>bochs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ramfb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </video>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hostdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='mode'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>subsystem</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='startupPolicy'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>mandatory</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>requisite</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>optional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='subsysType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pci</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>scsi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='capsType'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='pciBackend'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hostdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <rng supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtio-non-transitional</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>random</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>egd</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </rng>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <filesystem supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='driverType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>path</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>handle</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>virtiofs</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </filesystem>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tpm supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-tis</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tpm-crb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emulator</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>external</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendVersion'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>2.0</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </tpm>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <redirdev supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='bus'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>usb</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </redirdev>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <channel supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </channel>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <crypto supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendModel'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>builtin</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </crypto>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <interface supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='backendType'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>default</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>passt</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </interface>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <panic supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='model'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>isa</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>hyperv</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </panic>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <console supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='type'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>null</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vc</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pty</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dev</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>file</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>pipe</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stdio</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>udp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tcp</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>unix</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>qemu-vdagent</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>dbus</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </console>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </devices>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  <features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <gic supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <genid supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <backup supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <async-teardown supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <s390-pv supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <ps2 supported='yes'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <tdx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sev supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <sgx supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <hyperv supported='yes'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <enum name='features'>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>relaxed</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vapic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>spinlocks</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vpindex</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>runtime</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>synic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>stimer</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reset</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>vendor_id</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>frequencies</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>reenlightenment</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>tlbflush</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>ipi</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>avic</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>emsr_bitmap</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <value>xmm_input</value>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </enum>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      <defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:      </defaults>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    </hyperv>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:    <launchSecurity supported='no'/>
Jan 23 05:10:49 np0005593294 nova_compute[224687]:  </features>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: </domainCapabilities>
Jan 23 05:10:49 np0005593294 nova_compute[224687]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.473 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.474 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.474 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.478 224691 INFO nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Secure Boot support detected#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.480 224691 INFO nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.481 224691 INFO nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.494 224691 DEBUG nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 23 05:10:49 np0005593294 python3.9[225242]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.704 224691 INFO nova.virt.node [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Determined node identity b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from /var/lib/nova/compute_id#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.730 224691 WARNING nova.compute.manager [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Compute nodes ['b22b6ed5-7bca-42dc-9b99-6f2ad6853af7'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.790 224691 INFO nova.compute.manager [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.833 224691 WARNING nova.compute.manager [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.834 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.834 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.835 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.835 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:10:49 np0005593294 nova_compute[224687]: 2026-01-23 10:10:49.835 224691 DEBUG oslo_concurrency.processutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:50.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:50.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:50 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:10:50 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1010858224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:50 np0005593294 nova_compute[224687]: 2026-01-23 10:10:50.659 224691 DEBUG oslo_concurrency.processutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.823s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:50 np0005593294 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 05:10:50 np0005593294 systemd[1]: Started libvirt nodedev daemon.
Jan 23 05:10:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:50 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:50 np0005593294 python3.9[225419]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 05:10:50 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:10:50 np0005593294 nova_compute[224687]: 2026-01-23 10:10:50.973 224691 WARNING nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:50 np0005593294 nova_compute[224687]: 2026-01-23 10:10:50.974 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5226MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:10:50 np0005593294 nova_compute[224687]: 2026-01-23 10:10:50.975 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:50 np0005593294 nova_compute[224687]: 2026-01-23 10:10:50.975 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:50 np0005593294 nova_compute[224687]: 2026-01-23 10:10:50.995 224691 WARNING nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] No compute node record for compute-1.ctlplane.example.com:b22b6ed5-7bca-42dc-9b99-6f2ad6853af7: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 could not be found.#033[00m
Jan 23 05:10:51 np0005593294 nova_compute[224687]: 2026-01-23 10:10:51.026 224691 INFO nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7#033[00m
Jan 23 05:10:51 np0005593294 nova_compute[224687]: 2026-01-23 10:10:51.080 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:10:51 np0005593294 nova_compute[224687]: 2026-01-23 10:10:51.081 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:10:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:51 np0005593294 nova_compute[224687]: 2026-01-23 10:10:51.618 224691 INFO nova.scheduler.client.report [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] [req-95542ec9-2546-4865-880f-0d0f3dd71826] Created resource provider record via placement API for resource provider with UUID b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 and name compute-1.ctlplane.example.com.#033[00m
Jan 23 05:10:51 np0005593294 nova_compute[224687]: 2026-01-23 10:10:51.641 224691 DEBUG oslo_concurrency.processutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:51 np0005593294 python3.9[225620]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:10:51 np0005593294 systemd[1]: Stopping nova_compute container...
Jan 23 05:10:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:52 np0005593294 nova_compute[224687]: 2026-01-23 10:10:52.104 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:52 np0005593294 nova_compute[224687]: 2026-01-23 10:10:52.105 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:52 np0005593294 nova_compute[224687]: 2026-01-23 10:10:52.106 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:52 np0005593294 nova_compute[224687]: 2026-01-23 10:10:52.106 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:52.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:52 np0005593294 systemd[1]: libpod-d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4.scope: Deactivated successfully.
Jan 23 05:10:52 np0005593294 systemd[1]: libpod-d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4.scope: Consumed 4.409s CPU time.
Jan 23 05:10:52 np0005593294 virtqemud[225011]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 05:10:52 np0005593294 podman[225644]: 2026-01-23 10:10:52.577787507 +0000 UTC m=+0.721428611 container died d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:10:52 np0005593294 virtqemud[225011]: hostname: compute-1
Jan 23 05:10:52 np0005593294 virtqemud[225011]: End of file while reading data: Input/output error
Jan 23 05:10:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:52.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:52 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:52 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:10:52 np0005593294 systemd[1]: var-lib-containers-storage-overlay-4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576-merged.mount: Deactivated successfully.
Jan 23 05:10:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:54 np0005593294 podman[225644]: 2026-01-23 10:10:54.011476898 +0000 UTC m=+2.155117982 container cleanup d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:10:54 np0005593294 podman[225644]: nova_compute
Jan 23 05:10:54 np0005593294 podman[225676]: nova_compute
Jan 23 05:10:54 np0005593294 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 05:10:54 np0005593294 systemd[1]: Stopped nova_compute container.
Jan 23 05:10:54 np0005593294 systemd[1]: Starting nova_compute container...
Jan 23 05:10:54 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:10:54 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:54.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:54 np0005593294 podman[225689]: 2026-01-23 10:10:54.464315282 +0000 UTC m=+0.346538661 container init d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:10:54 np0005593294 podman[225689]: 2026-01-23 10:10:54.471283504 +0000 UTC m=+0.353506863 container start d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + sudo -E kolla_set_configs
Jan 23 05:10:54 np0005593294 podman[225689]: nova_compute
Jan 23 05:10:54 np0005593294 systemd[1]: Started nova_compute container.
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Validating config file
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying service configuration files
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /etc/ceph
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Creating directory /etc/ceph
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Writing out command to execute
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593294 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593294 nova_compute[225705]: ++ cat /run_command
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + CMD=nova-compute
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + ARGS=
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + sudo kolla_copy_cacerts
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + [[ ! -n '' ]]
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + . kolla_extend_start
Jan 23 05:10:54 np0005593294 nova_compute[225705]: Running command: 'nova-compute'
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + umask 0022
Jan 23 05:10:54 np0005593294 nova_compute[225705]: + exec nova-compute
Jan 23 05:10:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:54.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:54 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:10:55.037 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:10:55.038 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:10:55.038 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:55 np0005593294 python3.9[225868]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 05:10:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:55 np0005593294 systemd[1]: Started libpod-conmon-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7.scope.
Jan 23 05:10:55 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:10:55 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:55 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:55 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:55 np0005593294 podman[225895]: 2026-01-23 10:10:55.585930976 +0000 UTC m=+0.136830794 container init cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:10:55 np0005593294 podman[225895]: 2026-01-23 10:10:55.59864789 +0000 UTC m=+0.149547678 container start cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 23 05:10:55 np0005593294 python3.9[225868]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 05:10:55 np0005593294 nova_compute_init[225914]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 05:10:55 np0005593294 systemd[1]: libpod-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7.scope: Deactivated successfully.
Jan 23 05:10:55 np0005593294 podman[225915]: 2026-01-23 10:10:55.702164726 +0000 UTC m=+0.037361307 container died cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:10:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:56 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7-userdata-shm.mount: Deactivated successfully.
Jan 23 05:10:56 np0005593294 systemd[1]: var-lib-containers-storage-overlay-3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b-merged.mount: Deactivated successfully.
Jan 23 05:10:56 np0005593294 podman[225921]: 2026-01-23 10:10:56.199964148 +0000 UTC m=+0.511166707 container cleanup cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Jan 23 05:10:56 np0005593294 systemd[1]: libpod-conmon-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7.scope: Deactivated successfully.
Jan 23 05:10:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:56.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:10:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:56.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:56 np0005593294 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 05:10:56 np0005593294 systemd[1]: session-53.scope: Consumed 2min 4.413s CPU time.
Jan 23 05:10:56 np0005593294 systemd-logind[807]: Session 53 logged out. Waiting for processes to exit.
Jan 23 05:10:56 np0005593294 systemd-logind[807]: Removed session 53.
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.677 225709 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.677 225709 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.677 225709 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.678 225709 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 05:10:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.839 225709 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.862 225709 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:56 np0005593294 nova_compute[225705]: 2026-01-23 10:10:56.863 225709 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:10:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.298 225709 INFO nova.virt.driver [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 05:10:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:10:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.406 225709 INFO nova.compute.provider_config [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.415 225709 DEBUG oslo_concurrency.lockutils [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.415 225709 DEBUG oslo_concurrency.lockutils [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_concurrency.lockutils [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.429 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.429 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.429 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.435 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.435 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.435 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.482 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 WARNING oslo_config.cfg [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 05:10:57 np0005593294 nova_compute[225705]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 05:10:57 np0005593294 nova_compute[225705]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 05:10:57 np0005593294 nova_compute[225705]: and ``live_migration_inbound_addr`` respectively.
Jan 23 05:10:57 np0005593294 nova_compute[225705]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.571 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.571 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.572 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.582 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.582 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.582 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.583 225709 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 05:10:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.600 225709 INFO nova.virt.node [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Determined node identity b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from /var/lib/nova/compute_id#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.614 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7ff23e7c0970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.617 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7ff23e7c0970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.618 225709 INFO nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.625 225709 INFO nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <host>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <uuid>53821a39-1f4a-4bf2-b036-ba3044ea8780</uuid>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <arch>x86_64</arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <microcode version='16777317'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <signature family='23' model='49' stepping='0'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='x2apic'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='tsc-deadline'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='osxsave'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='hypervisor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='tsc_adjust'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='spec-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='stibp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='arch-capabilities'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='cmp_legacy'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='topoext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='virt-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='lbrv'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='tsc-scale'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='vmcb-clean'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='pause-filter'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='pfthreshold'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='rdctl-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='mds-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature name='pschange-mc-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <pages unit='KiB' size='4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <pages unit='KiB' size='2048'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <pages unit='KiB' size='1048576'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <power_management>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <suspend_mem/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </power_management>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <iommu support='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <migration_features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <live/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <uri_transports>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <uri_transport>tcp</uri_transport>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <uri_transport>rdma</uri_transport>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </uri_transports>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </migration_features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <topology>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <cells num='1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <cell id='0'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          <memory unit='KiB'>7864316</memory>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          <pages unit='KiB' size='4'>1966079</pages>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          <distances>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <sibling id='0' value='10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          </distances>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          <cpus num='8'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:          </cpus>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        </cell>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </cells>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </topology>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <cache>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </cache>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <secmodel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model>selinux</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <doi>0</doi>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </secmodel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <secmodel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model>dac</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <doi>0</doi>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </secmodel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </host>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <guest>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <os_type>hvm</os_type>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <arch name='i686'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <wordsize>32</wordsize>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <domain type='qemu'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <domain type='kvm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <pae/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <nonpae/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <apic default='on' toggle='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <cpuselection/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <deviceboot/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <externalSnapshot/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </guest>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <guest>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <os_type>hvm</os_type>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <arch name='x86_64'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <wordsize>64</wordsize>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <domain type='qemu'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <domain type='kvm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <apic default='on' toggle='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <cpuselection/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <deviceboot/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <externalSnapshot/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </guest>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 
Jan 23 05:10:57 np0005593294 nova_compute[225705]: </capabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: #033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.632 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.635 225709 DEBUG nova.virt.libvirt.volume.mount [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.637 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 05:10:57 np0005593294 nova_compute[225705]: <domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <arch>i686</arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <vcpu max='4096'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <os supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='firmware'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>rom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pflash</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>yes</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='secure'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </loader>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>memfd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </memoryBacking>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>disk</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>floppy</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>lun</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>fdc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>sata</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vnc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <video supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vga</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>none</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>bochs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='mode'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>requisite</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>optional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pci</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hostdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>random</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>path</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>handle</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </filesystem>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emulator</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>external</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>2.0</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </tpm>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </redirdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </channel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </crypto>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>passt</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>isa</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </panic>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <console supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>null</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dev</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pipe</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stdio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>udp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tcp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='features'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vapic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>runtime</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>synic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stimer</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reset</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ipi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>avic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <defaults>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </defaults>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hyperv>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: </domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.647 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 05:10:57 np0005593294 nova_compute[225705]: <domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <arch>i686</arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <vcpu max='240'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <os supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='firmware'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>rom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pflash</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>yes</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='secure'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </loader>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>memfd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </memoryBacking>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>disk</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>floppy</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>lun</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ide</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>fdc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>sata</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vnc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <video supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vga</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>none</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>bochs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='mode'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>requisite</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>optional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pci</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hostdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>random</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>path</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>handle</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </filesystem>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emulator</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>external</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>2.0</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </tpm>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </redirdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </channel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </crypto>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>passt</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>isa</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </panic>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <console supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>null</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dev</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pipe</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stdio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>udp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tcp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='features'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vapic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>runtime</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>synic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stimer</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reset</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ipi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>avic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <defaults>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </defaults>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hyperv>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: </domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.698 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.704 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 05:10:57 np0005593294 nova_compute[225705]: <domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <arch>x86_64</arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <vcpu max='4096'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <os supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='firmware'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>efi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>rom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pflash</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>yes</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='secure'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>yes</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </loader>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>memfd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </memoryBacking>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>disk</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>floppy</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>lun</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>fdc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>sata</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vnc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <video supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vga</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>none</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>bochs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='mode'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>requisite</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>optional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pci</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hostdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>random</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>path</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>handle</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </filesystem>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emulator</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>external</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>2.0</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </tpm>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </redirdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </channel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </crypto>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>passt</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>isa</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </panic>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <console supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>null</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dev</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pipe</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stdio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>udp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tcp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='features'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vapic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>runtime</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>synic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stimer</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reset</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ipi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>avic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <defaults>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </defaults>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hyperv>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: </domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:57 np0005593294 nova_compute[225705]: 2026-01-23 10:10:57.793 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 05:10:57 np0005593294 nova_compute[225705]: <domainCapabilities>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <arch>x86_64</arch>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <vcpu max='240'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <os supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='firmware'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>rom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pflash</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>yes</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='secure'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>no</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </loader>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>on</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>off</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </blockers>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </mode>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <value>memfd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </memoryBacking>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>disk</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>floppy</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>lun</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ide</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>fdc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>sata</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vnc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <video supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vga</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>none</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>bochs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='mode'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>requisite</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>optional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pci</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>scsi</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </hostdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>random</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>egd</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>path</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>handle</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </filesystem>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>emulator</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>external</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>2.0</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </tpm>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='bus'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>usb</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </redirdev>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </channel>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>builtin</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </crypto>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>default</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>passt</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='model'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>isa</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </panic>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <console supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='type'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>null</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vc</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pty</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dev</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>file</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>pipe</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stdio</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>udp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>tcp</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>unix</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>dbus</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      </enum>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:      <enum name='features'>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vapic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>runtime</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>synic</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>stimer</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>reset</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593294 nova_compute[225705]:        <value>frequencies</value>
Jan 23 05:18:28 np0005593294 nova_compute[225705]: 2026-01-23 10:18:28.618 229713 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:18:28 np0005593294 nova_compute[225705]: 2026-01-23 10:18:28.622 229713 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:18:28 np0005593294 nova_compute[225705]: 2026-01-23 10:18:28.625 229713 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Jan 23 05:18:28 np0005593294 nova_compute[225705]: 2026-01-23 10:18:28.625 229713 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229713#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.062 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:29 np0005593294 rsyslogd[1006]: imjournal: 3618 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.064 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape056b1c4-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.064 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape056b1c4-d8, col_values=(('external_ids', {'iface-id': 'e056b1c4-d8ee-40be-ab65-dad6851e9340', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:a1:b7', 'vm-uuid': 'ed3c80d1-b549-49d1-be66-00467e195256'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.066 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:29 np0005593294 NetworkManager[48978]: <info>  [1769163509.0678] manager: (tape056b1c4-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.068 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.078 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.079 225709 INFO os_vif [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8')#033[00m
Jan 23 05:18:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.127 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.127 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.127 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:42:a1:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.128 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Using config drive#033[00m
Jan 23 05:18:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101829 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.155 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.396 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Creating config drive at /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.404 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp27pbnrm5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.534 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp27pbnrm5" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.561 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:29 np0005593294 nova_compute[225705]: 2026-01-23 10:18:29.565 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config ed3c80d1-b549-49d1-be66-00467e195256_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:29.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:29.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:30 np0005593294 nova_compute[225705]: 2026-01-23 10:18:30.738 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config ed3c80d1-b549-49d1-be66-00467e195256_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:30 np0005593294 nova_compute[225705]: 2026-01-23 10:18:30.739 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deleting local config drive /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config because it was imported into RBD.#033[00m
Jan 23 05:18:30 np0005593294 systemd[1]: Starting libvirt secret daemon...
Jan 23 05:18:30 np0005593294 systemd[1]: Started libvirt secret daemon.
Jan 23 05:18:30 np0005593294 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 23 05:18:30 np0005593294 kernel: tape056b1c4-d8: entered promiscuous mode
Jan 23 05:18:30 np0005593294 NetworkManager[48978]: <info>  [1769163510.8844] manager: (tape056b1c4-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 23 05:18:30 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:30Z|00027|binding|INFO|Claiming lport e056b1c4-d8ee-40be-ab65-dad6851e9340 for this chassis.
Jan 23 05:18:30 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:30Z|00028|binding|INFO|e056b1c4-d8ee-40be-ab65-dad6851e9340: Claiming fa:16:3e:42:a1:b7 10.100.0.13
Jan 23 05:18:30 np0005593294 nova_compute[225705]: 2026-01-23 10:18:30.887 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.907 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:a1:b7 10.100.0.13'], port_security=['fa:16:3e:42:a1:b7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '96259b98-6654-41f6-bfeb-290c4063344e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93789b9e-064c-44b7-b00b-f52ca7e4569d, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=e056b1c4-d8ee-40be-ab65-dad6851e9340) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:18:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.909 143098 INFO neutron.agent.ovn.metadata.agent [-] Port e056b1c4-d8ee-40be-ab65-dad6851e9340 in datapath 4f467dc5-4a9f-42dc-990e-a2a671c8b09c bound to our chassis#033[00m
Jan 23 05:18:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.911 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f467dc5-4a9f-42dc-990e-a2a671c8b09c#033[00m
Jan 23 05:18:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.912 143098 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwhp69nko/privsep.sock']#033[00m
Jan 23 05:18:30 np0005593294 systemd-udevd[229839]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:18:30 np0005593294 systemd-machined[194551]: New machine qemu-1-instance-00000003.
Jan 23 05:18:30 np0005593294 NetworkManager[48978]: <info>  [1769163510.9580] device (tape056b1c4-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:18:30 np0005593294 NetworkManager[48978]: <info>  [1769163510.9593] device (tape056b1c4-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:18:30 np0005593294 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Jan 23 05:18:30 np0005593294 nova_compute[225705]: 2026-01-23 10:18:30.981 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:30 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:30Z|00029|binding|INFO|Setting lport e056b1c4-d8ee-40be-ab65-dad6851e9340 ovn-installed in OVS
Jan 23 05:18:30 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:30Z|00030|binding|INFO|Setting lport e056b1c4-d8ee-40be-ab65-dad6851e9340 up in Southbound
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:30.998 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:31 np0005593294 podman[229778]: 2026-01-23 10:18:31.01057181 +0000 UTC m=+0.220245120 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:18:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.172 225709 DEBUG nova.compute.manager [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.172 225709 DEBUG oslo_concurrency.lockutils [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.172 225709 DEBUG oslo_concurrency.lockutils [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.173 225709 DEBUG oslo_concurrency.lockutils [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.173 225709 DEBUG nova.compute.manager [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Processing event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.248 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.487 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163511.4864285, ed3c80d1-b549-49d1-be66-00467e195256 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.488 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Started (Lifecycle Event)#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.491 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.496 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.501 225709 INFO nova.virt.libvirt.driver [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance spawned successfully.#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.502 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.512 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.520 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.525 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.526 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.526 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.527 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.527 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.527 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.558 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.559 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163511.4866998, ed3c80d1-b549-49d1-be66-00467e195256 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.559 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.599 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.603 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163511.4953184, ed3c80d1-b549-49d1-be66-00467e195256 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.603 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.607 225709 INFO nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 8.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.608 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.618 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.623 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.650 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.670 225709 INFO nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 9.45 seconds to build instance.#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.678 143098 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.680 143098 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwhp69nko/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.540 229898 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.545 229898 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.546 229898 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.547 229898 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229898#033[00m
Jan 23 05:18:31 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.683 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[9726addb-4939-421e-90f0-82628f34a560]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:31 np0005593294 nova_compute[225705]: 2026-01-23 10:18:31.686 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:31.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:31.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.251 229898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.251 229898 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.251 229898 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.855 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d75731-3b88-4dd3-83a7-14415bdb0f31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.856 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f467dc5-41 in ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.858 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f467dc5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.858 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[97e83ca2-e11c-4412-8c8c-c416fd3c6d51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.860 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[47ce72c7-1cf9-452a-99b7-032493d11143]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.884 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[08aaa96c-97cf-4938-bb74-28c75ee08873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.903 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[610e3f31-122a-4eab-8965-7b3d134f4190]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.907 143098 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp2keu1f1z/privsep.sock']#033[00m
Jan 23 05:18:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:33 np0005593294 nova_compute[225705]: 2026-01-23 10:18:33.278 225709 DEBUG nova.compute.manager [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:33 np0005593294 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG oslo_concurrency.lockutils [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:33 np0005593294 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG oslo_concurrency.lockutils [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:33 np0005593294 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG oslo_concurrency.lockutils [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:33 np0005593294 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG nova.compute.manager [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:18:33 np0005593294 nova_compute[225705]: 2026-01-23 10:18:33.280 225709 WARNING nova.compute.manager [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:18:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.707 143098 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.708 143098 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2keu1f1z/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.606 229913 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.609 229913 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.611 229913 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.611 229913 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229913#033[00m
Jan 23 05:18:33 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.711 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[22de4e2c-2057-4577-810e-6506987d6680]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:33.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:33.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:34 np0005593294 nova_compute[225705]: 2026-01-23 10:18:34.067 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.199 229913 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.199 229913 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.199 229913 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.791 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[00b4d305-dbd0-4555-bb80-fa7032b7d712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 NetworkManager[48978]: <info>  [1769163514.8100] manager: (tap4f467dc5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.809 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb72c89-c3f6-4a83-aad1-8b88de714140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 systemd-udevd[229925]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.839 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[e793fa8b-3acc-416c-84f4-ec770a1b48d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.845 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a0221d-4693-4d6f-8845-1482e0f2d80f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 NetworkManager[48978]: <info>  [1769163514.8696] device (tap4f467dc5-40): carrier: link connected
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.874 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[384fe701-baf2-491a-b10e-7e1c6c20b770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.895 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[54fe8993-7ca8-45e2-89bc-7b30371bc522]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f467dc5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9b:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466885, 'reachable_time': 23089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229943, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.912 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[374b6fe3-c774-4db1-9c39-056435abf99b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:9bc5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466885, 'tstamp': 466885}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229944, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.929 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[14c12a87-50e2-4721-8078-ec197b674db7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f467dc5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9b:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466885, 'reachable_time': 23089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229945, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:34 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.959 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8dda72-a2b7-4d2d-921f-68950dd28b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.013 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[70e784e0-0f6a-466b-830b-e3c6ad81aaeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.015 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f467dc5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.015 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.016 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f467dc5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:35 np0005593294 nova_compute[225705]: 2026-01-23 10:18:35.057 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593294 kernel: tap4f467dc5-40: entered promiscuous mode
Jan 23 05:18:35 np0005593294 NetworkManager[48978]: <info>  [1769163515.0638] manager: (tap4f467dc5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 23 05:18:35 np0005593294 nova_compute[225705]: 2026-01-23 10:18:35.064 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.065 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f467dc5-40, col_values=(('external_ids', {'iface-id': '572285ac-9ff4-42d8-9b72-b5588035f74c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:35 np0005593294 nova_compute[225705]: 2026-01-23 10:18:35.066 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:35Z|00031|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.070 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.071 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6d623c4d-f850-42ef-b902-8ccaeac0642e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.073 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-4f467dc5-4a9f-42dc-990e-a2a671c8b09c
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.pid.haproxy
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID 4f467dc5-4a9f-42dc-990e-a2a671c8b09c
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:18:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.074 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'env', 'PROCESS_TAG=haproxy-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:18:35 np0005593294 nova_compute[225705]: 2026-01-23 10:18:35.082 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:35 np0005593294 podman[229978]: 2026-01-23 10:18:35.46865314 +0000 UTC m=+0.058749877 container create d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:18:35 np0005593294 systemd[1]: Started libpod-conmon-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f.scope.
Jan 23 05:18:35 np0005593294 podman[229978]: 2026-01-23 10:18:35.439739736 +0000 UTC m=+0.029836503 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:18:35 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:18:35 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38a430a453066cd300215ffab9c681910b2ee216372ea5d2773756ffea2ac606/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:18:35 np0005593294 podman[229978]: 2026-01-23 10:18:35.559752599 +0000 UTC m=+0.149849336 container init d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:18:35 np0005593294 podman[229978]: 2026-01-23 10:18:35.566614376 +0000 UTC m=+0.156711113 container start d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 05:18:35 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : New worker (230000) forked
Jan 23 05:18:35 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : Loading success.
Jan 23 05:18:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:35.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:35.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:36 np0005593294 nova_compute[225705]: 2026-01-23 10:18:36.282 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:37.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:37.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5508] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Jan 23 05:18:38 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:38Z|00032|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.551 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5531] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <warn>  [1769163518.5535] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5552] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5558] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <warn>  [1769163518.5559] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5572] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5582] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5590] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 05:18:38 np0005593294 NetworkManager[48978]: <info>  [1769163518.5596] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 05:18:38 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:38Z|00033|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.585 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.590 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG nova.compute.manager [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG nova.compute.manager [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG oslo_concurrency.lockutils [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG oslo_concurrency.lockutils [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:18:38 np0005593294 nova_compute[225705]: 2026-01-23 10:18:38.761 225709 DEBUG nova.network.neutron [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:18:39 np0005593294 nova_compute[225705]: 2026-01-23 10:18:39.069 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:39.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:39.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:40 np0005593294 nova_compute[225705]: 2026-01-23 10:18:40.587 225709 DEBUG nova.network.neutron [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:18:40 np0005593294 nova_compute[225705]: 2026-01-23 10:18:40.588 225709 DEBUG nova.network.neutron [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:40 np0005593294 nova_compute[225705]: 2026-01-23 10:18:40.605 225709 DEBUG oslo_concurrency.lockutils [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:18:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:41 np0005593294 nova_compute[225705]: 2026-01-23 10:18:41.284 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:41.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:18:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:41.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:18:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:43 np0005593294 podman[230016]: 2026-01-23 10:18:43.688418712 +0000 UTC m=+0.082679693 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:18:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:43.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:43.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:44 np0005593294 nova_compute[225705]: 2026-01-23 10:18:44.072 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:45 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:45Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:a1:b7 10.100.0.13
Jan 23 05:18:45 np0005593294 ovn_controller[133293]: 2026-01-23T10:18:45Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:a1:b7 10.100.0.13
Jan 23 05:18:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:45.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:45.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:46 np0005593294 nova_compute[225705]: 2026-01-23 10:18:46.286 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101847 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 116ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:18:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:47.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:47.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:49 np0005593294 nova_compute[225705]: 2026-01-23 10:18:49.082 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:49.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:49.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:51 np0005593294 nova_compute[225705]: 2026-01-23 10:18:51.310 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:51 np0005593294 nova_compute[225705]: 2026-01-23 10:18:51.545 225709 INFO nova.compute.manager [None req-3b9a1359-602c-4d0e-92e3-2c69b939e4b0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Get console output#033[00m
Jan 23 05:18:51 np0005593294 nova_compute[225705]: 2026-01-23 10:18:51.550 225709 INFO oslo.privsep.daemon [None req-3b9a1359-602c-4d0e-92e3-2c69b939e4b0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp52lkgdah/privsep.sock']#033[00m
Jan 23 05:18:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:51.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:52 np0005593294 nova_compute[225705]: 2026-01-23 10:18:52.252 225709 INFO oslo.privsep.daemon [None req-3b9a1359-602c-4d0e-92e3-2c69b939e4b0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 05:18:52 np0005593294 nova_compute[225705]: 2026-01-23 10:18:52.099 230072 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:18:52 np0005593294 nova_compute[225705]: 2026-01-23 10:18:52.104 230072 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:18:52 np0005593294 nova_compute[225705]: 2026-01-23 10:18:52.106 230072 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 23 05:18:52 np0005593294 nova_compute[225705]: 2026-01-23 10:18:52.106 230072 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230072#033[00m
Jan 23 05:18:52 np0005593294 nova_compute[225705]: 2026-01-23 10:18:52.348 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:18:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040008d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:53.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:53.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:54 np0005593294 nova_compute[225705]: 2026-01-23 10:18:54.084 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:55.047 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:55.047 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:18:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040008d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:55.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:55.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:56 np0005593294 nova_compute[225705]: 2026-01-23 10:18:56.311 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:18:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:57.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:57.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002120 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.086 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.217 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.218 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.219 225709 DEBUG nova.objects.instance [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.567 225709 DEBUG nova.objects.instance [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_requests' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.587 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.794 225709 DEBUG nova.policy [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:18:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:59.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:18:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.868 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:18:59 np0005593294 nova_compute[225705]: 2026-01-23 10:18:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:18:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:18:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:59.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:00 np0005593294 nova_compute[225705]: 2026-01-23 10:19:00.362 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:00 np0005593294 nova_compute[225705]: 2026-01-23 10:19:00.363 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:00 np0005593294 nova_compute[225705]: 2026-01-23 10:19:00.363 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:19:00 np0005593294 nova_compute[225705]: 2026-01-23 10:19:00.363 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002120 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:01 np0005593294 nova_compute[225705]: 2026-01-23 10:19:01.313 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:01 np0005593294 nova_compute[225705]: 2026-01-23 10:19:01.357 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Successfully created port: 35c98901-92ff-40ab-a9c4-0da34169949c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:19:01 np0005593294 podman[230079]: 2026-01-23 10:19:01.737567892 +0000 UTC m=+0.134832442 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:19:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:01.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:01.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.105 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.121 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.121 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.122 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.305 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Successfully updated port: 35c98901-92ff-40ab-a9c4-0da34169949c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.322 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.322 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.322 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.423 225709 DEBUG nova.compute.manager [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.424 225709 DEBUG nova.compute.manager [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-35c98901-92ff-40ab-a9c4-0da34169949c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.424 225709 DEBUG oslo_concurrency.lockutils [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:02 np0005593294 nova_compute[225705]: 2026-01-23 10:19:02.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:19:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:03.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:03 np0005593294 nova_compute[225705]: 2026-01-23 10:19:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:03 np0005593294 nova_compute[225705]: 2026-01-23 10:19:03.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:03 np0005593294 nova_compute[225705]: 2026-01-23 10:19:03.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:19:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:03.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.088 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.857 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.894 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.896 225709 DEBUG oslo_concurrency.lockutils [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.896 225709 DEBUG nova.network.neutron [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port 35c98901-92ff-40ab-a9c4-0da34169949c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.899 225709 DEBUG nova.virt.libvirt.vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.900 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.901 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.901 225709 DEBUG os_vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.902 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.902 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.903 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.907 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.907 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap35c98901-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.908 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap35c98901-92, col_values=(('external_ids', {'iface-id': '35c98901-92ff-40ab-a9c4-0da34169949c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4c:1e:6d', 'vm-uuid': 'ed3c80d1-b549-49d1-be66-00467e195256'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.909 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 NetworkManager[48978]: <info>  [1769163544.9102] manager: (tap35c98901-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.913 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.920 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.921 225709 INFO os_vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92')#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.922 225709 DEBUG nova.virt.libvirt.vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.923 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.924 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.928 225709 DEBUG nova.virt.libvirt.guest [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] attach device xml: <interface type="ethernet">
Jan 23 05:19:04 np0005593294 nova_compute[225705]:  <mac address="fa:16:3e:4c:1e:6d"/>
Jan 23 05:19:04 np0005593294 nova_compute[225705]:  <model type="virtio"/>
Jan 23 05:19:04 np0005593294 nova_compute[225705]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:04 np0005593294 nova_compute[225705]:  <mtu size="1442"/>
Jan 23 05:19:04 np0005593294 nova_compute[225705]:  <target dev="tap35c98901-92"/>
Jan 23 05:19:04 np0005593294 nova_compute[225705]: </interface>
Jan 23 05:19:04 np0005593294 nova_compute[225705]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:19:04 np0005593294 kernel: tap35c98901-92: entered promiscuous mode
Jan 23 05:19:04 np0005593294 NetworkManager[48978]: <info>  [1769163544.9442] manager: (tap35c98901-92): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 05:19:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:04Z|00034|binding|INFO|Claiming lport 35c98901-92ff-40ab-a9c4-0da34169949c for this chassis.
Jan 23 05:19:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:04Z|00035|binding|INFO|35c98901-92ff-40ab-a9c4-0da34169949c: Claiming fa:16:3e:4c:1e:6d 10.100.0.26
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.945 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.961 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:1e:6d 10.100.0.26'], port_security=['fa:16:3e:4c:1e:6d 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b2d21e8-4b70-4725-bde5-4813c876e6bd, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=35c98901-92ff-40ab-a9c4-0da34169949c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.964 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 35c98901-92ff-40ab-a9c4-0da34169949c in datapath 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a bound to our chassis#033[00m
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.966 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a#033[00m
Jan 23 05:19:04 np0005593294 systemd-udevd[230139]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.985 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5d1aab-1ab5-4069-906e-cd1b1e402ac2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.986 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c9ea62d-41 in ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.988 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.988 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c9ea62d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.989 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[87d4efa6-a638-44bf-b4d6-279470ea7838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:04Z|00036|binding|INFO|Setting lport 35c98901-92ff-40ab-a9c4-0da34169949c ovn-installed in OVS
Jan 23 05:19:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:04Z|00037|binding|INFO|Setting lport 35c98901-92ff-40ab-a9c4-0da34169949c up in Southbound
Jan 23 05:19:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.990 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a4e7bd-e8cd-4dd4-84f8-c7ba528b0844]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:04 np0005593294 nova_compute[225705]: 2026-01-23 10:19:04.991 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:05 np0005593294 NetworkManager[48978]: <info>  [1769163544.9994] device (tap35c98901-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:19:05 np0005593294 NetworkManager[48978]: <info>  [1769163544.9999] device (tap35c98901-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.018 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[8a876276-2179-4b3b-9cba-07303de0f6f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.041 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[83146e2c-bcfe-4615-bed5-57f9940624c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.050 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.051 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.051 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:42:a1:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.051 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:4c:1e:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.067 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3c3211-2293-4025-a7ca-fe04e78ddbea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 NetworkManager[48978]: <info>  [1769163545.0728] manager: (tap5c9ea62d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.072 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[539a0be0-e3d0-415f-b82e-234047e76374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.084 225709 DEBUG nova.virt.libvirt.guest [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:05</nova:creationTime>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:05 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    <nova:port uuid="35c98901-92ff-40ab-a9c4-0da34169949c">
Jan 23 05:19:05 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:05 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:05 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:05 np0005593294 nova_compute[225705]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.107 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cf3322-0615-4ff4-be9e-8dd565510322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.112 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[77721f7d-3e1a-4d62-974d-c43e9cda8e3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.119 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:05 np0005593294 NetworkManager[48978]: <info>  [1769163545.1388] device (tap5c9ea62d-40): carrier: link connected
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.147 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c25bba8e-36c2-45c7-8b45-95ca39e446e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.165 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1e6161-9b18-49ba-a525-198c765f8d65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c9ea62d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:ca:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469912, 'reachable_time': 39567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230166, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.183 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[36412169-3cdb-46b0-93c2-de772429993e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:caf4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469912, 'tstamp': 469912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230167, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.201 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[be23fea0-529d-4ebe-a181-54b0d623cb6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c9ea62d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:ca:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469912, 'reachable_time': 39567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230168, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.232 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[94e12952-adca-4dc7-8957-344cddc7c9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.288 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[a018731a-97eb-4719-8776-c464890ee3f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.289 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c9ea62d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.289 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.289 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c9ea62d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.291 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:05 np0005593294 NetworkManager[48978]: <info>  [1769163545.2918] manager: (tap5c9ea62d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 05:19:05 np0005593294 kernel: tap5c9ea62d-40: entered promiscuous mode
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.295 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c9ea62d-40, col_values=(('external_ids', {'iface-id': '179215ec-6510-4ebf-a6e5-fe4278583ce3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.296 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:05 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:05Z|00038|binding|INFO|Releasing lport 179215ec-6510-4ebf-a6e5-fe4278583ce3 from this chassis (sb_readonly=0)
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.297 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.297 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.298 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[dd87451e-f19f-4b0e-9b69-6ad632e8cab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.299 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.pid.haproxy
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:19:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.299 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'env', 'PROCESS_TAG=haproxy-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.308 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.496 225709 DEBUG nova.compute.manager [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.497 225709 DEBUG oslo_concurrency.lockutils [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.497 225709 DEBUG oslo_concurrency.lockutils [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.498 225709 DEBUG oslo_concurrency.lockutils [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.498 225709 DEBUG nova.compute.manager [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.498 225709 WARNING nova.compute.manager [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.#033[00m
Jan 23 05:19:05 np0005593294 podman[230201]: 2026-01-23 10:19:05.667629523 +0000 UTC m=+0.047079020 container create c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:19:05 np0005593294 systemd[1]: Started libpod-conmon-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad.scope.
Jan 23 05:19:05 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:19:05 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89c1e4de03bd97a8d8d560a5b0fc97bed6d4cbd47a0d6d1dbe06563b1dadaf91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:19:05 np0005593294 podman[230201]: 2026-01-23 10:19:05.642152628 +0000 UTC m=+0.021602145 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:19:05 np0005593294 podman[230201]: 2026-01-23 10:19:05.750941246 +0000 UTC m=+0.130390743 container init c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:19:05 np0005593294 podman[230201]: 2026-01-23 10:19:05.756037397 +0000 UTC m=+0.135486894 container start c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:19:05 np0005593294 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : New worker (230222) forked
Jan 23 05:19:05 np0005593294 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : Loading success.
Jan 23 05:19:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:19:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:05.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:05.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.898 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.899 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.899 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.900 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:19:05 np0005593294 nova_compute[225705]: 2026-01-23 10:19:05.900 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.314 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:06 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1043744357' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.370 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.408 225709 DEBUG nova.network.neutron [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port 35c98901-92ff-40ab-a9c4-0da34169949c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.409 225709 DEBUG nova.network.neutron [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.437 225709 DEBUG oslo_concurrency.lockutils [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.455 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.456 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:19:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.636 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.637 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4719MB free_disk=59.942726135253906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.637 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.638 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.718 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance ed3c80d1-b549-49d1-be66-00467e195256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.718 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.719 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.753 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.813 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-35c98901-92ff-40ab-a9c4-0da34169949c" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.814 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-35c98901-92ff-40ab-a9c4-0da34169949c" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.834 225709 DEBUG nova.objects.instance [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.859 225709 DEBUG nova.virt.libvirt.vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.860 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.861 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.866 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.870 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.872 225709 DEBUG nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Attempting to detach device tap35c98901-92 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.873 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <mac address="fa:16:3e:4c:1e:6d"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <model type="virtio"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <mtu size="1442"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <target dev="tap35c98901-92"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: </interface>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.883 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.889 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <name>instance-00000003</name>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:05</nova:creationTime>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:port uuid="35c98901-92ff-40ab-a9c4-0da34169949c">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target dev='tape056b1c4-d8'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:4c:1e:6d'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target dev='tap35c98901-92'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='net1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.900 225709 INFO nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tap35c98901-92 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the persistent domain config.#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.901 225709 DEBUG nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] (1/8): Attempting to detach device tap35c98901-92 with device alias net1 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.902 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <mac address="fa:16:3e:4c:1e:6d"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <model type="virtio"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <mtu size="1442"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <target dev="tap35c98901-92"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: </interface>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:19:06 np0005593294 kernel: tap35c98901-92 (unregistering): left promiscuous mode
Jan 23 05:19:06 np0005593294 NetworkManager[48978]: <info>  [1769163546.9637] device (tap35c98901-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:19:06 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:06Z|00039|binding|INFO|Releasing lport 35c98901-92ff-40ab-a9c4-0da34169949c from this chassis (sb_readonly=0)
Jan 23 05:19:06 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:06Z|00040|binding|INFO|Setting lport 35c98901-92ff-40ab-a9c4-0da34169949c down in Southbound
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.968 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:06 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:06Z|00041|binding|INFO|Removing iface tap35c98901-92 ovn-installed in OVS
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.979 225709 DEBUG nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Received event <DeviceRemovedEvent: 1769163546.9786458, ed3c80d1-b549-49d1-be66-00467e195256 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:19:06 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.978 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:1e:6d 10.100.0.26'], port_security=['fa:16:3e:4c:1e:6d 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b2d21e8-4b70-4725-bde5-4813c876e6bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=35c98901-92ff-40ab-a9c4-0da34169949c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:06 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.979 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 35c98901-92ff-40ab-a9c4-0da34169949c in datapath 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a unbound from our chassis#033[00m
Jan 23 05:19:06 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.981 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.983 225709 DEBUG nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Start waiting for the detach event from libvirt for device tap35c98901-92 with device alias net1 for instance ed3c80d1-b549-49d1-be66-00467e195256 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:19:06 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.982 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d53214b7-6d5a-43e1-9a9c-6eff3a496ccb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:06 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.982 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a namespace which is not needed anymore#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.984 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:19:06 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.990 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <name>instance-00000003</name>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:05</nova:creationTime>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <nova:port uuid="35c98901-92ff-40ab-a9c4-0da34169949c">
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:06 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:06 np0005593294 nova_compute[225705]:      <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target dev='tape056b1c4-d8'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:07 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:19:07 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:06.998 225709 INFO nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tap35c98901-92 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the live domain config.
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.000 225709 DEBUG nova.virt.libvirt.vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.001 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.002 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.003 225709 DEBUG os_vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.009 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.010 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35c98901-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.012 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.014 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.017 225709 INFO os_vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92')
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.018 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:07</nova:creationTime>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:07 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:07 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:07 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:07 np0005593294 nova_compute[225705]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 05:19:07 np0005593294 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : haproxy version is 2.8.14-c23fe91
Jan 23 05:19:07 np0005593294 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : path to executable is /usr/sbin/haproxy
Jan 23 05:19:07 np0005593294 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [ALERT]    (230220) : Current worker (230222) exited with code 143 (Terminated)
Jan 23 05:19:07 np0005593294 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [WARNING]  (230220) : All workers exited. Exiting... (0)
Jan 23 05:19:07 np0005593294 systemd[1]: libpod-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad.scope: Deactivated successfully.
Jan 23 05:19:07 np0005593294 podman[230296]: 2026-01-23 10:19:07.127069815 +0000 UTC m=+0.058045815 container died c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:19:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:07 np0005593294 systemd[1]: var-lib-containers-storage-overlay-89c1e4de03bd97a8d8d560a5b0fc97bed6d4cbd47a0d6d1dbe06563b1dadaf91-merged.mount: Deactivated successfully.
Jan 23 05:19:07 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad-userdata-shm.mount: Deactivated successfully.
Jan 23 05:19:07 np0005593294 podman[230296]: 2026-01-23 10:19:07.174608298 +0000 UTC m=+0.105584278 container cleanup c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:19:07 np0005593294 systemd[1]: libpod-conmon-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad.scope: Deactivated successfully.
Jan 23 05:19:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1377108216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.244 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:19:07 np0005593294 podman[230328]: 2026-01-23 10:19:07.244868929 +0000 UTC m=+0.048757021 container remove c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.249 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.251 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa18e58-2b74-42de-842c-fc888ae4d311]: (4, ('Fri Jan 23 10:19:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a (c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad)\nc3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad\nFri Jan 23 10:19:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a (c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad)\nc3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.252 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[11b42957-3506-467b-8405-1422f38f6c52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.253 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c9ea62d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.255 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:07 np0005593294 kernel: tap5c9ea62d-40: left promiscuous mode
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.269 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.273 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[57dbed41-dd22-4ef8-998c-fc7ebed18c83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.284 225709 ERROR nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [req-22f68184-d6e8-4b1f-a131-e6c2a286d387] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID b22b6ed5-7bca-42dc-9b99-6f2ad6853af7.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-22f68184-d6e8-4b1f-a131-e6c2a286d387"}]}#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.290 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2e7789-9964-4558-8ef6-ad2e9005d5c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.292 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[26166fd5-328b-401d-958b-e43f782be4e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.301 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.309 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd638d4-299a-493d-830c-fb971964a847]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469904, 'reachable_time': 36842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230345, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:07 np0005593294 systemd[1]: run-netns-ovnmeta\x2d5c9ea62d\x2d4d78\x2d4e2a\x2d9702\x2ddb61ccfdb58a.mount: Deactivated successfully.
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.321 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.321 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.321 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:19:07 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.322 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[50774e63-3ccb-4198-8cc1-03480a622fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.334 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.369 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.415 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.658 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.659 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.659 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.659 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.660 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.660 225709 WARNING nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.660 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-unplugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.661 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.661 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.661 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.662 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-unplugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.662 225709 WARNING nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-unplugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.#033[00m
Jan 23 05:19:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:07.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:07.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3477460864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.926 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:07 np0005593294 nova_compute[225705]: 2026-01-23 10:19:07.932 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:19:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.277 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updated inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.278 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.278 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.302 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.302 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.558 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.559 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:08 np0005593294 nova_compute[225705]: 2026-01-23 10:19:08.559 225709 DEBUG nova.network.neutron [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:19:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101909 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.297 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.330 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.814 225709 DEBUG nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.815 225709 DEBUG oslo_concurrency.lockutils [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.815 225709 DEBUG oslo_concurrency.lockutils [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.816 225709 DEBUG oslo_concurrency.lockutils [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.816 225709 DEBUG nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.817 225709 WARNING nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.817 225709 DEBUG nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-deleted-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.818 225709 INFO nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Neutron deleted interface 35c98901-92ff-40ab-a9c4-0da34169949c; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.818 225709 DEBUG nova.network.neutron [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.846 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:09 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:09.847 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:09 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:09.849 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.852 225709 DEBUG nova.objects.instance [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'system_metadata' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:09.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.906 225709 DEBUG nova.objects.instance [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'flavor' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.941 225709 DEBUG nova.virt.libvirt.vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.941 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.942 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.946 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.949 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <name>instance-00000003</name>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:07</nova:creationTime>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target dev='tape056b1c4-d8'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.950 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.953 225709 INFO nova.network.neutron [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Port 35c98901-92ff-40ab-a9c4-0da34169949c from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.954 225709 DEBUG nova.network.neutron [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.960 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <name>instance-00000003</name>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:07</nova:creationTime>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target dev='tape056b1c4-d8'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.962 225709 WARNING nova.virt.libvirt.driver [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Detaching interface fa:16:3e:4c:1e:6d failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap35c98901-92' not found.#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.963 225709 DEBUG nova.virt.libvirt.vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.963 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.963 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.964 225709 DEBUG os_vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.966 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.966 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35c98901-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.967 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.969 225709 INFO os_vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92')#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.970 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:19:09</nova:creationTime>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 05:19:09 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:19:09 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:19:09 np0005593294 nova_compute[225705]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.973 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:09 np0005593294 nova_compute[225705]: 2026-01-23 10:19:09.990 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-35c98901-92ff-40ab-a9c4-0da34169949c" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:10 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:10Z|00042|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.261 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.791 225709 DEBUG nova.compute.manager [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.791 225709 DEBUG nova.compute.manager [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.791 225709 DEBUG oslo_concurrency.lockutils [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.792 225709 DEBUG oslo_concurrency.lockutils [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.792 225709 DEBUG nova.network.neutron [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.854 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.855 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.855 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.855 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.856 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.857 225709 INFO nova.compute.manager [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Terminating instance#033[00m
Jan 23 05:19:10 np0005593294 nova_compute[225705]: 2026-01-23 10:19:10.858 225709 DEBUG nova.compute.manager [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:19:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0003430 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.317 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:11 np0005593294 kernel: tape056b1c4-d8 (unregistering): left promiscuous mode
Jan 23 05:19:11 np0005593294 NetworkManager[48978]: <info>  [1769163551.8087] device (tape056b1c4-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:19:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:11Z|00043|binding|INFO|Releasing lport e056b1c4-d8ee-40be-ab65-dad6851e9340 from this chassis (sb_readonly=0)
Jan 23 05:19:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:11Z|00044|binding|INFO|Setting lport e056b1c4-d8ee-40be-ab65-dad6851e9340 down in Southbound
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.813 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:11Z|00045|binding|INFO|Removing iface tape056b1c4-d8 ovn-installed in OVS
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.815 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.820 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:a1:b7 10.100.0.13'], port_security=['fa:16:3e:42:a1:b7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '96259b98-6654-41f6-bfeb-290c4063344e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93789b9e-064c-44b7-b00b-f52ca7e4569d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=e056b1c4-d8ee-40be-ab65-dad6851e9340) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.821 143098 INFO neutron.agent.ovn.metadata.agent [-] Port e056b1c4-d8ee-40be-ab65-dad6851e9340 in datapath 4f467dc5-4a9f-42dc-990e-a2a671c8b09c unbound from our chassis#033[00m
Jan 23 05:19:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.822 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f467dc5-4a9f-42dc-990e-a2a671c8b09c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:19:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.823 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[628bb562-1f1f-44c5-893b-9ade97e3c9fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.824 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c namespace which is not needed anymore#033[00m
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.830 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:11.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:11 np0005593294 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 23 05:19:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:11.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:11 np0005593294 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 14.726s CPU time.
Jan 23 05:19:11 np0005593294 systemd-machined[194551]: Machine qemu-1-instance-00000003 terminated.
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.921 225709 DEBUG nova.network.neutron [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.923 225709 DEBUG nova.network.neutron [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:11 np0005593294 nova_compute[225705]: 2026-01-23 10:19:11.940 225709 DEBUG oslo_concurrency.lockutils [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:11 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : haproxy version is 2.8.14-c23fe91
Jan 23 05:19:11 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : path to executable is /usr/sbin/haproxy
Jan 23 05:19:11 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [WARNING]  (229998) : Exiting Master process...
Jan 23 05:19:11 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [ALERT]    (229998) : Current worker (230000) exited with code 143 (Terminated)
Jan 23 05:19:11 np0005593294 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [WARNING]  (229998) : All workers exited. Exiting... (0)
Jan 23 05:19:11 np0005593294 systemd[1]: libpod-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f.scope: Deactivated successfully.
Jan 23 05:19:11 np0005593294 podman[230396]: 2026-01-23 10:19:11.972366187 +0000 UTC m=+0.059337667 container died d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.007 225709 DEBUG nova.compute.manager [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-unplugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.008 225709 DEBUG oslo_concurrency.lockutils [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.008 225709 DEBUG oslo_concurrency.lockutils [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.009 225709 DEBUG oslo_concurrency.lockutils [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.009 225709 DEBUG nova.compute.manager [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-unplugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.009 225709 DEBUG nova.compute.manager [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-unplugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.012 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.083 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.091 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.100 225709 INFO nova.virt.libvirt.driver [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance destroyed successfully.#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.100 225709 DEBUG nova.objects.instance [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.116 225709 DEBUG nova.virt.libvirt.vif [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.117 225709 DEBUG nova.network.os_vif_util [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.117 225709 DEBUG nova.network.os_vif_util [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.118 225709 DEBUG os_vif [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.119 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.120 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape056b1c4-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.126 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:12 np0005593294 nova_compute[225705]: 2026-01-23 10:19:12.129 225709 INFO os_vif [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8')#033[00m
Jan 23 05:19:12 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f-userdata-shm.mount: Deactivated successfully.
Jan 23 05:19:12 np0005593294 systemd[1]: var-lib-containers-storage-overlay-38a430a453066cd300215ffab9c681910b2ee216372ea5d2773756ffea2ac606-merged.mount: Deactivated successfully.
Jan 23 05:19:12 np0005593294 podman[230396]: 2026-01-23 10:19:12.284209974 +0000 UTC m=+0.371181454 container cleanup d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:19:12 np0005593294 systemd[1]: libpod-conmon-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f.scope: Deactivated successfully.
Jan 23 05:19:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:13 np0005593294 podman[230455]: 2026-01-23 10:19:13.159816972 +0000 UTC m=+0.849082270 container remove d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.166 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e54c0d2a-6dcb-4d34-90ef-4458c51dc2c0]: (4, ('Fri Jan 23 10:19:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c (d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f)\nd1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f\nFri Jan 23 10:19:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c (d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f)\nd1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.168 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[05cedbd5-9b56-4cf2-b8df-0245ec391ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.168 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f467dc5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:13 np0005593294 nova_compute[225705]: 2026-01-23 10:19:13.171 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:13 np0005593294 kernel: tap4f467dc5-40: left promiscuous mode
Jan 23 05:19:13 np0005593294 nova_compute[225705]: 2026-01-23 10:19:13.185 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.188 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[106658e0-5b25-4f37-b8d1-e374b23af90e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.210 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c33bd82a-c3a3-45cc-ae4c-a9d6c6b9ea98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.212 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc50864-fcd7-4e24-acc8-4861f08f7ec0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.233 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[37a2df42-f322-4271-b0c3-0c4a9b934490]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466876, 'reachable_time': 19041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230471, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.237 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:19:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.237 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6e87f3-67c0-413e-bebb-d832b0059f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:13 np0005593294 systemd[1]: run-netns-ovnmeta\x2d4f467dc5\x2d4a9f\x2d42dc\x2d990e\x2da2a671c8b09c.mount: Deactivated successfully.
Jan 23 05:19:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:13.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:13.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0003430 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.095 225709 DEBUG nova.compute.manager [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.095 225709 DEBUG oslo_concurrency.lockutils [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.095 225709 DEBUG oslo_concurrency.lockutils [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.096 225709 DEBUG oslo_concurrency.lockutils [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.096 225709 DEBUG nova.compute.manager [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.096 225709 WARNING nova.compute.manager [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:19:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.521 225709 INFO nova.virt.libvirt.driver [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deleting instance files /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256_del#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.523 225709 INFO nova.virt.libvirt.driver [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deletion of /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256_del complete#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.573 225709 DEBUG nova.virt.libvirt.host [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.573 225709 INFO nova.virt.libvirt.host [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] UEFI support detected#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.576 225709 INFO nova.compute.manager [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 3.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.577 225709 DEBUG oslo.service.loopingcall [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.577 225709 DEBUG nova.compute.manager [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:19:14 np0005593294 nova_compute[225705]: 2026-01-23 10:19:14.577 225709 DEBUG nova.network.neutron [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:19:14 np0005593294 podman[230474]: 2026-01-23 10:19:14.706792804 +0000 UTC m=+0.105830376 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:19:14 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:14.853 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004260 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:15.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:15.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.319 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.389 225709 DEBUG nova.network.neutron [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.406 225709 INFO nova.compute.manager [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 1.83 seconds to deallocate network for instance.#033[00m
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.463 225709 DEBUG nova.compute.manager [req-14d0e6d9-14cb-478c-89a0-bd899dc08df6 req-7923d7bd-ef43-4c21-8512-87132622ad97 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-deleted-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.468 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.468 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:16 np0005593294 nova_compute[225705]: 2026-01-23 10:19:16.517 225709 DEBUG oslo_concurrency.processutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.031 225709 DEBUG oslo_concurrency.processutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.039 225709 DEBUG nova.compute.provider_tree [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.064 225709 DEBUG nova.scheduler.client.report [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.086 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.117 225709 INFO nova.scheduler.client.report [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance ed3c80d1-b549-49d1-be66-00467e195256#033[00m
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.123 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:17 np0005593294 nova_compute[225705]: 2026-01-23 10:19:17.201 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:19:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:19:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:17.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:17.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004280 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:19.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:19.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:21 np0005593294 nova_compute[225705]: 2026-01-23 10:19:21.368 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:21.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:21.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:21 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:21 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:22 np0005593294 nova_compute[225705]: 2026-01-23 10:19:22.125 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:22 np0005593294 nova_compute[225705]: 2026-01-23 10:19:22.457 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:22 np0005593294 nova_compute[225705]: 2026-01-23 10:19:22.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:23.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:25.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:25.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:26 np0005593294 nova_compute[225705]: 2026-01-23 10:19:26.371 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:27 np0005593294 nova_compute[225705]: 2026-01-23 10:19:27.099 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163552.0976357, ed3c80d1-b549-49d1-be66-00467e195256 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:27 np0005593294 nova_compute[225705]: 2026-01-23 10:19:27.100 225709 INFO nova.compute.manager [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:19:27 np0005593294 nova_compute[225705]: 2026-01-23 10:19:27.125 225709 DEBUG nova.compute.manager [None req-83158962-eff8-4a10-ae7d-f24339bf8aec - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:27 np0005593294 nova_compute[225705]: 2026-01-23 10:19:27.128 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:27.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:29.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:31 np0005593294 nova_compute[225705]: 2026-01-23 10:19:31.373 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:31.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:32 np0005593294 nova_compute[225705]: 2026-01-23 10:19:32.131 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:32.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:32 np0005593294 podman[230659]: 2026-01-23 10:19:32.715060698 +0000 UTC m=+0.106097065 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:19:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:33.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:34.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:35.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:36 np0005593294 nova_compute[225705]: 2026-01-23 10:19:36.374 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:36.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:37 np0005593294 nova_compute[225705]: 2026-01-23 10:19:37.134 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:37.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:38.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.462 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.463 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.481 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.548 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.549 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.554 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.555 225709 INFO nova.compute.claims [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Claim successful on node compute-1.ctlplane.example.com
Jan 23 05:19:39 np0005593294 nova_compute[225705]: 2026-01-23 10:19:39.661 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:19:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:39.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:40 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3401891283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.110 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.119 225709 DEBUG nova.compute.provider_tree [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.389 225709 DEBUG nova.scheduler.client.report [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.425 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.876s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.426 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:19:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.470 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.471 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.489 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:19:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:40.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.507 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:19:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.595 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.596 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.597 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Creating image(s)
Jan 23 05:19:40 np0005593294 nova_compute[225705]: 2026-01-23 10:19:40.624 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.018 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.052 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.057 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.080 225709 DEBUG nova.policy [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.117 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.118 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.119 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.120 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.148 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.152 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:19:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:41 np0005593294 nova_compute[225705]: 2026-01-23 10:19:41.376 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:41.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:42 np0005593294 nova_compute[225705]: 2026-01-23 10:19:42.129 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Successfully created port: 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:19:42 np0005593294 nova_compute[225705]: 2026-01-23 10:19:42.136 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:42.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:42 np0005593294 nova_compute[225705]: 2026-01-23 10:19:42.707 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.554s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:42 np0005593294 nova_compute[225705]: 2026-01-23 10:19:42.796 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.025 225709 DEBUG nova.objects.instance [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.045 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.046 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Ensure instance console log exists: /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.047 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.048 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.048 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.568 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Successfully updated port: 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.596 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.596 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.596 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.713 225709 DEBUG nova.compute.manager [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.713 225709 DEBUG nova.compute.manager [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing instance network info cache due to event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:19:43 np0005593294 nova_compute[225705]: 2026-01-23 10:19:43.713 225709 DEBUG oslo_concurrency.lockutils [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:44 np0005593294 nova_compute[225705]: 2026-01-23 10:19:44.478 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:19:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:44.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:45 np0005593294 podman[230907]: 2026-01-23 10:19:45.650258754 +0000 UTC m=+0.056480007 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:19:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:45.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.378 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.499 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:19:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.521 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.521 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance network_info: |[{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.522 225709 DEBUG oslo_concurrency.lockutils [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.522 225709 DEBUG nova.network.neutron [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.525 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start _get_guest_xml network_info=[{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.529 225709 WARNING nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.532 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.532 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:19:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.534 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.535 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.535 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.535 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:19:46 np0005593294 nova_compute[225705]: 2026-01-23 10:19:46.540 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:19:47 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2913217029' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.125 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.159 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.165 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.186 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:19:47 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3310066042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.622 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.623 225709 DEBUG nova.virt.libvirt.vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-263648847',display_name='tempest-TestNetworkBasicOps-server-263648847',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-263648847',id=4,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD66+yyfhusLfImf1uKQDIRs2V4/o1F8Eh/Z0SqxwU6ND9wcYC22x/WBjiGE3hyxp/MbA2OwCQVvqoeXerAOWd4tdGub7VFIxVXpqt6OghLL3nU7rH27QJ0mug8wzMNOTA==',key_name='tempest-TestNetworkBasicOps-501443179',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-rvp4p09d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:40Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.624 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.625 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.626 225709 DEBUG nova.objects.instance [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.763 225709 DEBUG nova.network.neutron [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated VIF entry in instance network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.763 225709 DEBUG nova.network.neutron [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.800 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <uuid>87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334</uuid>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <name>instance-00000004</name>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <memory>131072</memory>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <vcpu>1</vcpu>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:name>tempest-TestNetworkBasicOps-server-263648847</nova:name>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:creationTime>2026-01-23 10:19:46</nova:creationTime>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:flavor name="m1.nano">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:memory>128</nova:memory>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:disk>1</nova:disk>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:swap>0</nova:swap>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </nova:flavor>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:owner>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </nova:owner>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <nova:ports>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <nova:port uuid="5def28f3-3bf5-4f1f-8e37-51794dbddfc6">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        </nova:port>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </nova:ports>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </nova:instance>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <sysinfo type="smbios">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <entry name="serial">87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334</entry>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <entry name="uuid">87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334</entry>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <boot dev="hd"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <smbios mode="sysinfo"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <vmcoreinfo/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <clock offset="utc">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <timer name="hpet" present="no"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <cpu mode="host-model" match="exact">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <disk type="network" device="disk">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <target dev="vda" bus="virtio"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <disk type="network" device="cdrom">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <target dev="sda" bus="sata"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <interface type="ethernet">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <mac address="fa:16:3e:d5:a8:9e"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <mtu size="1442"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <target dev="tap5def28f3-3b"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <serial type="pty">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <log file="/var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/console.log" append="off"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <input type="tablet" bus="usb"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <rng model="virtio">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <controller type="usb" index="0"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    <memballoon model="virtio">
Jan 23 05:19:47 np0005593294 nova_compute[225705]:      <stats period="10"/>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:19:47 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:19:47 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:19:47 np0005593294 nova_compute[225705]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.801 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Preparing to wait for external event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.801 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.801 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.802 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.802 225709 DEBUG nova.virt.libvirt.vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-263648847',display_name='tempest-TestNetworkBasicOps-server-263648847',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-263648847',id=4,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD66+yyfhusLfImf1uKQDIRs2V4/o1F8Eh/Z0SqxwU6ND9wcYC22x/WBjiGE3hyxp/MbA2OwCQVvqoeXerAOWd4tdGub7VFIxVXpqt6OghLL3nU7rH27QJ0mug8wzMNOTA==',key_name='tempest-TestNetworkBasicOps-501443179',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-rvp4p09d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:40Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.803 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.803 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.803 225709 DEBUG os_vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.804 225709 DEBUG oslo_concurrency.lockutils [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.804 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.805 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.805 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.808 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.808 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5def28f3-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.809 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5def28f3-3b, col_values=(('external_ids', {'iface-id': '5def28f3-3bf5-4f1f-8e37-51794dbddfc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:a8:9e', 'vm-uuid': '87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.811 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:47 np0005593294 NetworkManager[48978]: <info>  [1769163587.8126] manager: (tap5def28f3-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.813 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.820 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:47 np0005593294 nova_compute[225705]: 2026-01-23 10:19:47.821 225709 INFO os_vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b')#033[00m
Jan 23 05:19:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:47.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.037 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.037 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.037 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:d5:a8:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.038 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Using config drive#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.116 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:48.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.808 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Creating config drive at /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.813 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0x9frigc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.944 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0x9frigc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.991 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:48 np0005593294 nova_compute[225705]: 2026-01-23 10:19:48.996 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.577 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.579 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deleting local config drive /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config because it was imported into RBD.#033[00m
Jan 23 05:19:49 np0005593294 kernel: tap5def28f3-3b: entered promiscuous mode
Jan 23 05:19:49 np0005593294 NetworkManager[48978]: <info>  [1769163589.6504] manager: (tap5def28f3-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 05:19:49 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:49Z|00046|binding|INFO|Claiming lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for this chassis.
Jan 23 05:19:49 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:49Z|00047|binding|INFO|5def28f3-3bf5-4f1f-8e37-51794dbddfc6: Claiming fa:16:3e:d5:a8:9e 10.100.0.8
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.650 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.654 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.669 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a8:9e 10.100.0.8'], port_security=['fa:16:3e:d5:a8:9e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f52d23-9898-43a0-a951-b69cb2abebab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '077cd29b-8d1e-4ab1-b762-8cd58191c522', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30316727-f942-4d99-94ec-26d1184b5c8a, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=5def28f3-3bf5-4f1f-8e37-51794dbddfc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.671 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 in datapath 13f52d23-9898-43a0-a951-b69cb2abebab bound to our chassis#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.672 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 13f52d23-9898-43a0-a951-b69cb2abebab#033[00m
Jan 23 05:19:49 np0005593294 systemd-udevd[231064]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:49 np0005593294 systemd-machined[194551]: New machine qemu-2-instance-00000004.
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.686 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e166fe2c-ea73-45ab-b57c-4e08a4a79c5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.687 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap13f52d23-91 in ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.689 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap13f52d23-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.689 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[df06d815-f0ec-4985-b465-452af119deae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.690 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7e96bf8d-16b3-4817-bcdb-b2bddd2e6d9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 NetworkManager[48978]: <info>  [1769163589.6973] device (tap5def28f3-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:19:49 np0005593294 NetworkManager[48978]: <info>  [1769163589.6984] device (tap5def28f3-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.701 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[b830199b-6428-4046-914b-0e767a984559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:49 np0005593294 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 23 05:19:49 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:49Z|00048|binding|INFO|Setting lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 ovn-installed in OVS
Jan 23 05:19:49 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:49Z|00049|binding|INFO|Setting lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 up in Southbound
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.724 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.728 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8e552358-4a9c-4d13-83b5-169617ff856a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.754 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[d6703ecd-e1e1-4f5d-a8fe-30f977622869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 NetworkManager[48978]: <info>  [1769163589.7605] manager: (tap13f52d23-90): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.761 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[f36674fa-25eb-4411-9813-42ae764f2486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.789 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[9242e68a-21c5-4d2b-a2f4-a6afd7d2c7c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.791 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b79957-5801-484c-bdbc-fbb3e5b3d1ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 NetworkManager[48978]: <info>  [1769163589.8158] device (tap13f52d23-90): carrier: link connected
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.823 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[63e79906-d8d9-4c1d-a4c8-e1eb1c481df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.846 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5880ef-910d-4115-b647-ed225da114d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f52d23-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:4e:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474379, 'reachable_time': 29178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231098, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.864 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0638fca8-0335-4dc9-a78a-532bcd7a96e7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febe:4e18'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474379, 'tstamp': 474379}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231100, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:49.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.888 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9ec703-314d-42bb-8173-a826256e23dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f52d23-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:4e:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474379, 'reachable_time': 29178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231101, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.929 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[99521177-1fff-467d-ab2b-8d8554453608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.934 225709 DEBUG nova.compute.manager [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.934 225709 DEBUG oslo_concurrency.lockutils [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.934 225709 DEBUG oslo_concurrency.lockutils [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.935 225709 DEBUG oslo_concurrency.lockutils [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:49 np0005593294 nova_compute[225705]: 2026-01-23 10:19:49.935 225709 DEBUG nova.compute.manager [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Processing event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.000 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[143ebb9f-deee-4e15-b00c-a0785c041a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.003 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f52d23-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.004 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.005 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13f52d23-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.006 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:50 np0005593294 NetworkManager[48978]: <info>  [1769163590.0074] manager: (tap13f52d23-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 05:19:50 np0005593294 kernel: tap13f52d23-90: entered promiscuous mode
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.013 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap13f52d23-90, col_values=(('external_ids', {'iface-id': 'bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.014 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:50 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:50Z|00050|binding|INFO|Releasing lport bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a from this chassis (sb_readonly=0)
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.016 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/13f52d23-9898-43a0-a951-b69cb2abebab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/13f52d23-9898-43a0-a951-b69cb2abebab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.016 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[a28ee66e-9576-48d8-b54b-de45bbcce73a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.017 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-13f52d23-9898-43a0-a951-b69cb2abebab
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/13f52d23-9898-43a0-a951-b69cb2abebab.pid.haproxy
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID 13f52d23-9898-43a0-a951-b69cb2abebab
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 05:19:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.018 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'env', 'PROCESS_TAG=haproxy-13f52d23-9898-43a0-a951-b69cb2abebab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/13f52d23-9898-43a0-a951-b69cb2abebab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.027 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.248 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.249 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163590.2477126, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.250 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Started (Lifecycle Event)
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.252 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.255 225709 INFO nova.virt.libvirt.driver [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance spawned successfully.
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.256 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.284 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.290 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.291 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.292 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.292 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.293 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.294 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.298 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.342 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.343 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163590.2479613, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.343 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Paused (Lifecycle Event)
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.367 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.372 225709 INFO nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 9.78 seconds to spawn the instance on the hypervisor.
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.373 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.374 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163590.2521753, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.374 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Resumed (Lifecycle Event)
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.402 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.406 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:19:50 np0005593294 podman[231176]: 2026-01-23 10:19:50.416712822 +0000 UTC m=+0.067575767 container create 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.430 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:19:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.444 225709 INFO nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 10.92 seconds to build instance.
Jan 23 05:19:50 np0005593294 nova_compute[225705]: 2026-01-23 10:19:50.461 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:50 np0005593294 podman[231176]: 2026-01-23 10:19:50.374663793 +0000 UTC m=+0.025526788 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:19:50 np0005593294 systemd[1]: Started libpod-conmon-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e.scope.
Jan 23 05:19:50 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:19:50 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/200b86ca9cf8069eb92d9df0e5692d0577d96f4ecb60bc72d1a62929a70d50cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:19:50 np0005593294 podman[231176]: 2026-01-23 10:19:50.510079003 +0000 UTC m=+0.160941968 container init 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:19:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:50 np0005593294 podman[231176]: 2026-01-23 10:19:50.516049682 +0000 UTC m=+0.166912627 container start 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:19:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:50 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : New worker (231198) forked
Jan 23 05:19:50 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : Loading success.
Jan 23 05:19:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:51 np0005593294 nova_compute[225705]: 2026-01-23 10:19:51.420 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:19:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1804.3 total, 600.0 interval#012Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 10K writes, 2684 syncs, 4.00 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1689 writes, 4700 keys, 1689 commit groups, 1.0 writes per commit group, ingest: 4.62 MB, 0.01 MB/s#012Interval WAL: 1689 writes, 725 syncs, 2.33 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:19:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:51.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.023 225709 DEBUG nova.compute.manager [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.024 225709 DEBUG oslo_concurrency.lockutils [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.024 225709 DEBUG oslo_concurrency.lockutils [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.025 225709 DEBUG oslo_concurrency.lockutils [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.025 225709 DEBUG nova.compute.manager [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] No waiting events found dispatching network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.025 225709 WARNING nova.compute.manager [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received unexpected event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for instance with vm_state active and task_state None.
Jan 23 05:19:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:52.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:52 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:52Z|00051|binding|INFO|Releasing lport bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a from this chassis (sb_readonly=0)
Jan 23 05:19:52 np0005593294 NetworkManager[48978]: <info>  [1769163592.6714] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.670 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:52 np0005593294 NetworkManager[48978]: <info>  [1769163592.6724] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 05:19:52 np0005593294 ovn_controller[133293]: 2026-01-23T10:19:52Z|00052|binding|INFO|Releasing lport bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a from this chassis (sb_readonly=0)
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.707 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.713 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:52 np0005593294 nova_compute[225705]: 2026-01-23 10:19:52.811 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:53 np0005593294 nova_compute[225705]: 2026-01-23 10:19:53.112 225709 DEBUG nova.compute.manager [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:53 np0005593294 nova_compute[225705]: 2026-01-23 10:19:53.113 225709 DEBUG nova.compute.manager [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing instance network info cache due to event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:19:53 np0005593294 nova_compute[225705]: 2026-01-23 10:19:53.113 225709 DEBUG oslo_concurrency.lockutils [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:19:53 np0005593294 nova_compute[225705]: 2026-01-23 10:19:53.114 225709 DEBUG oslo_concurrency.lockutils [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:19:53 np0005593294 nova_compute[225705]: 2026-01-23 10:19:53.114 225709 DEBUG nova.network.neutron [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:19:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004480 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:53.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:54.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:19:55.049 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002ee0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:55 np0005593294 nova_compute[225705]: 2026-01-23 10:19:55.648 225709 DEBUG nova.network.neutron [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated VIF entry in instance network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:19:55 np0005593294 nova_compute[225705]: 2026-01-23 10:19:55.649 225709 DEBUG nova.network.neutron [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:19:55 np0005593294 nova_compute[225705]: 2026-01-23 10:19:55.721 225709 DEBUG oslo_concurrency.lockutils [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:19:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:55.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:56 np0005593294 nova_compute[225705]: 2026-01-23 10:19:56.422 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00044a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:56.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:57 np0005593294 nova_compute[225705]: 2026-01-23 10:19:57.815 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:57.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002ee0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:58.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00044c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:59 np0005593294 nova_compute[225705]: 2026-01-23 10:19:59.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:59 np0005593294 nova_compute[225705]: 2026-01-23 10:19:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:19:59 np0005593294 nova_compute[225705]: 2026-01-23 10:19:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:19:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:19:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:59.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:00 np0005593294 nova_compute[225705]: 2026-01-23 10:20:00.398 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:00 np0005593294 nova_compute[225705]: 2026-01-23 10:20:00.399 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:00 np0005593294 nova_compute[225705]: 2026-01-23 10:20:00.399 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:20:00 np0005593294 nova_compute[225705]: 2026-01-23 10:20:00.400 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002ee0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00044e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:01 np0005593294 nova_compute[225705]: 2026-01-23 10:20:01.424 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:01 np0005593294 ceph-mds[84630]: mds.beacon.cephfs.compute-1.bcvzvj missed beacon ack from the monitors
Jan 23 05:20:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:01.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:02 np0005593294 nova_compute[225705]: 2026-01-23 10:20:02.489 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:02 np0005593294 nova_compute[225705]: 2026-01-23 10:20:02.513 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:02 np0005593294 nova_compute[225705]: 2026-01-23 10:20:02.513 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:20:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:02.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:02 np0005593294 nova_compute[225705]: 2026-01-23 10:20:02.818 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:02 np0005593294 nova_compute[225705]: 2026-01-23 10:20:02.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:02 np0005593294 nova_compute[225705]: 2026-01-23 10:20:02.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:03 np0005593294 podman[231215]: 2026-01-23 10:20:03.681965509 +0000 UTC m=+0.085973678 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 05:20:03 np0005593294 nova_compute[225705]: 2026-01-23 10:20:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:03.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004500 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:04.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:04 np0005593294 nova_compute[225705]: 2026-01-23 10:20:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:04 np0005593294 nova_compute[225705]: 2026-01-23 10:20:04.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:20:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).paxos(paxos updating c 2009..2632) lease_timeout -- calling new election
Jan 23 05:20:05 np0005593294 ceph-mon[80126]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 23 05:20:05 np0005593294 ceph-mon[80126]: paxos.2).electionLogic(14) init, last seen epoch 14
Jan 23 05:20:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 05:20:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 05:20:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:05 np0005593294 ceph-mds[84630]: mds.beacon.cephfs.compute-1.bcvzvj missed beacon ack from the monitors
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.896 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.896 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.897 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:20:05 np0005593294 nova_compute[225705]: 2026-01-23 10:20:05.897 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:05.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(electing) e3 handle_auth_request failed to assign global_id
Jan 23 05:20:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(electing) e3 handle_auth_request failed to assign global_id
Jan 23 05:20:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.426 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.594 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.696s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.673 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.673 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.858 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.860 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4682MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.860 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.860 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.941 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.942 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.942 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:20:06 np0005593294 nova_compute[225705]: 2026-01-23 10:20:06.987 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:20:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1764227055' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:07 np0005593294 nova_compute[225705]: 2026-01-23 10:20:07.437 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:07 np0005593294 nova_compute[225705]: 2026-01-23 10:20:07.444 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:20:07 np0005593294 nova_compute[225705]: 2026-01-23 10:20:07.470 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:20:07 np0005593294 nova_compute[225705]: 2026-01-23 10:20:07.496 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:20:07 np0005593294 nova_compute[225705]: 2026-01-23 10:20:07.497 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:07 np0005593294 nova_compute[225705]: 2026-01-23 10:20:07.820 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:07.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:08.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004790 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: mon.compute-1 calling monitor election
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: mon.compute-0 calling monitor election
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:20:08 np0005593294 ceph-mon[80126]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:20:08 np0005593294 ceph-mon[80126]:     osd.1 observed slow operation indications in BlueStore
Jan 23 05:20:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:09 np0005593294 nova_compute[225705]: 2026-01-23 10:20:09.499 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:09 np0005593294 ovn_controller[133293]: 2026-01-23T10:20:09Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d5:a8:9e 10.100.0.8
Jan 23 05:20:09 np0005593294 ovn_controller[133293]: 2026-01-23T10:20:09Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:a8:9e 10.100.0.8
Jan 23 05:20:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:09.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:10.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00047b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:11 np0005593294 nova_compute[225705]: 2026-01-23 10:20:11.427 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:11.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:12.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:12 np0005593294 nova_compute[225705]: 2026-01-23 10:20:12.824 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:13 np0005593294 ceph-mon[80126]: Health check update: 2 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 05:20:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:20:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:13.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:20:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:14.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:15 np0005593294 nova_compute[225705]: 2026-01-23 10:20:15.866 225709 INFO nova.compute.manager [None req-f566c29b-e1ca-4f38-a548-e7924f179629 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Get console output#033[00m
Jan 23 05:20:15 np0005593294 nova_compute[225705]: 2026-01-23 10:20:15.878 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:20:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:15.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:16 np0005593294 nova_compute[225705]: 2026-01-23 10:20:16.430 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00047f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:16 np0005593294 podman[231322]: 2026-01-23 10:20:16.657982764 +0000 UTC m=+0.058690186 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:20:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:17 np0005593294 nova_compute[225705]: 2026-01-23 10:20:17.828 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:17.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:18.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:18 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:20:18.564 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:20:18 np0005593294 nova_compute[225705]: 2026-01-23 10:20:18.565 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:18 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:20:18.565 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:20:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:18 np0005593294 nova_compute[225705]: 2026-01-23 10:20:18.678 225709 DEBUG nova.compute.manager [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:20:18 np0005593294 nova_compute[225705]: 2026-01-23 10:20:18.678 225709 DEBUG nova.compute.manager [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing instance network info cache due to event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:20:18 np0005593294 nova_compute[225705]: 2026-01-23 10:20:18.679 225709 DEBUG oslo_concurrency.lockutils [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:18 np0005593294 nova_compute[225705]: 2026-01-23 10:20:18.679 225709 DEBUG oslo_concurrency.lockutils [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:18 np0005593294 nova_compute[225705]: 2026-01-23 10:20:18.680 225709 DEBUG nova.network.neutron [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:20:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004810 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:19.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:19 np0005593294 nova_compute[225705]: 2026-01-23 10:20:19.974 225709 DEBUG nova.network.neutron [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated VIF entry in instance network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:20:19 np0005593294 nova_compute[225705]: 2026-01-23 10:20:19.974 225709 DEBUG nova.network.neutron [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:19 np0005593294 nova_compute[225705]: 2026-01-23 10:20:19.996 225709 DEBUG oslo_concurrency.lockutils [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:20.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:21 np0005593294 nova_compute[225705]: 2026-01-23 10:20:21.463 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:21.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 05:20:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:22.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 05:20:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:22 np0005593294 nova_compute[225705]: 2026-01-23 10:20:22.831 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:23.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:23 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:23 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.292301086 +0000 UTC m=+0.038717115 container create 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:20:24 np0005593294 systemd[1]: Started libpod-conmon-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope.
Jan 23 05:20:24 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.274907066 +0000 UTC m=+0.021323105 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.385340007 +0000 UTC m=+0.131756046 container init 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.397811801 +0000 UTC m=+0.144227810 container start 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.401523619 +0000 UTC m=+0.147939638 container attach 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:20:24 np0005593294 zen_blackburn[231631]: 167 167
Jan 23 05:20:24 np0005593294 systemd[1]: libpod-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope: Deactivated successfully.
Jan 23 05:20:24 np0005593294 conmon[231631]: conmon 87cd8b0a9f0eb6a87a8e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope/container/memory.events
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.413534349 +0000 UTC m=+0.159950398 container died 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:20:24 np0005593294 systemd[1]: var-lib-containers-storage-overlay-91478f3a342789530622a48e8c1e4e0cdb64828e1319f68d0c8403600efe1b51-merged.mount: Deactivated successfully.
Jan 23 05:20:24 np0005593294 podman[231615]: 2026-01-23 10:20:24.45631955 +0000 UTC m=+0.202735569 container remove 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 05:20:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:24 np0005593294 systemd[1]: libpod-conmon-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope: Deactivated successfully.
Jan 23 05:20:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:24.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:24 np0005593294 podman[231654]: 2026-01-23 10:20:24.624997372 +0000 UTC m=+0.041304936 container create 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:20:24 np0005593294 systemd[1]: Started libpod-conmon-1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613.scope.
Jan 23 05:20:24 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:20:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 05:20:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:20:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 05:20:24 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 05:20:24 np0005593294 podman[231654]: 2026-01-23 10:20:24.606988094 +0000 UTC m=+0.023295678 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:20:24 np0005593294 podman[231654]: 2026-01-23 10:20:24.711238948 +0000 UTC m=+0.127546522 container init 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:20:24 np0005593294 podman[231654]: 2026-01-23 10:20:24.717421564 +0000 UTC m=+0.133729148 container start 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:20:24 np0005593294 podman[231654]: 2026-01-23 10:20:24.721873085 +0000 UTC m=+0.138180659 container attach 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:20:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:25 np0005593294 focused_villani[231670]: [
Jan 23 05:20:25 np0005593294 focused_villani[231670]:    {
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "available": false,
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "being_replaced": false,
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "ceph_device_lvm": false,
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "lsm_data": {},
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "lvs": [],
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "path": "/dev/sr0",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "rejected_reasons": [
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "Insufficient space (<5GB)",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "Has a FileSystem"
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        ],
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        "sys_api": {
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "actuators": null,
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "device_nodes": [
Jan 23 05:20:25 np0005593294 focused_villani[231670]:                "sr0"
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            ],
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "devname": "sr0",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "human_readable_size": "482.00 KB",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "id_bus": "ata",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "model": "QEMU DVD-ROM",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "nr_requests": "2",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "parent": "/dev/sr0",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "partitions": {},
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "path": "/dev/sr0",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "removable": "1",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "rev": "2.5+",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "ro": "0",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "rotational": "1",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "sas_address": "",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "sas_device_handle": "",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "scheduler_mode": "mq-deadline",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "sectors": 0,
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "sectorsize": "2048",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "size": 493568.0,
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "support_discard": "2048",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "type": "disk",
Jan 23 05:20:25 np0005593294 focused_villani[231670]:            "vendor": "QEMU"
Jan 23 05:20:25 np0005593294 focused_villani[231670]:        }
Jan 23 05:20:25 np0005593294 focused_villani[231670]:    }
Jan 23 05:20:25 np0005593294 focused_villani[231670]: ]
Jan 23 05:20:25 np0005593294 systemd[1]: libpod-1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613.scope: Deactivated successfully.
Jan 23 05:20:25 np0005593294 podman[231654]: 2026-01-23 10:20:25.571530303 +0000 UTC m=+0.987837897 container died 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 05:20:25 np0005593294 systemd[1]: var-lib-containers-storage-overlay-886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b-merged.mount: Deactivated successfully.
Jan 23 05:20:25 np0005593294 podman[231654]: 2026-01-23 10:20:25.62267265 +0000 UTC m=+1.038980254 container remove 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:20:25 np0005593294 systemd[1]: libpod-conmon-1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613.scope: Deactivated successfully.
Jan 23 05:20:25 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:25 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:25.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:26 np0005593294 nova_compute[225705]: 2026-01-23 10:20:26.466 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:26.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:27 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:20:27.568 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:20:27 np0005593294 nova_compute[225705]: 2026-01-23 10:20:27.835 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:27.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:28.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:29 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:29 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:29 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:20:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:29.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:30 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:20:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:30.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:31 np0005593294 nova_compute[225705]: 2026-01-23 10:20:31.467 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:31.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:32.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:32 np0005593294 nova_compute[225705]: 2026-01-23 10:20:32.837 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:33.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:34 np0005593294 podman[233068]: 2026-01-23 10:20:34.251350076 +0000 UTC m=+0.081979213 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:20:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:35 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:35 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:35.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:36 np0005593294 nova_compute[225705]: 2026-01-23 10:20:36.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:36.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:37 np0005593294 nova_compute[225705]: 2026-01-23 10:20:37.840 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:20:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00048c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:38.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:39.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00048e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:40.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:41 np0005593294 nova_compute[225705]: 2026-01-23 10:20:41.525 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:41.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:42.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:42 np0005593294 nova_compute[225705]: 2026-01-23 10:20:42.844 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004900 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:43.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:44.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:46 np0005593294 nova_compute[225705]: 2026-01-23 10:20:46.527 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:46.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:20:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 4988 writes, 27K keys, 4988 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 4988 writes, 4988 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1476 writes, 7220 keys, 1476 commit groups, 1.0 writes per commit group, ingest: 17.08 MB, 0.03 MB/s#012Interval WAL: 1476 writes, 1476 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     53.2      0.71              0.12        14    0.050       0      0       0.0       0.0#012  L6      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.3    120.5    104.0      1.54              0.46        13    0.119     68K   6777       0.0       0.0#012 Sum      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.3     82.7     88.1      2.25              0.57        27    0.083     68K   6777       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.3     88.6     87.9      0.83              0.20        10    0.083     29K   2602       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   0.0    120.5    104.0      1.54              0.46        13    0.119     68K   6777       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     53.4      0.70              0.12        13    0.054       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.037, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.19 GB write, 0.11 MB/s write, 0.18 GB read, 0.10 MB/s read, 2.3 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 13.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000233 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(719,12.97 MB,4.26751%) FilterBlock(27,201.17 KB,0.064624%) IndexBlock(27,355.48 KB,0.114195%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:20:47 np0005593294 podman[233131]: 2026-01-23 10:20:47.68502592 +0000 UTC m=+0.081290801 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:20:47 np0005593294 nova_compute[225705]: 2026-01-23 10:20:47.847 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:47.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004940 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:48.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:49.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:50.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004960 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:51 np0005593294 nova_compute[225705]: 2026-01-23 10:20:51.530 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:51.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:52.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:52 np0005593294 nova_compute[225705]: 2026-01-23 10:20:52.849 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:53.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004980 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004980 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:54.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:20:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:20:55.049 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:20:55.050 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:56 np0005593294 nova_compute[225705]: 2026-01-23 10:20:56.531 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:56.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:56 np0005593294 nova_compute[225705]: 2026-01-23 10:20:56.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:56 np0005593294 nova_compute[225705]: 2026-01-23 10:20:56.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:20:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:57 np0005593294 nova_compute[225705]: 2026-01-23 10:20:57.852 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 05:20:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:57.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 05:20:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00049a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:58.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:59 np0005593294 nova_compute[225705]: 2026-01-23 10:20:59.888 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:59 np0005593294 nova_compute[225705]: 2026-01-23 10:20:59.888 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:20:59 np0005593294 nova_compute[225705]: 2026-01-23 10:20:59.888 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:20:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:20:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:59.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.062 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.063 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.063 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.063 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.261 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.264 225709 INFO nova.compute.manager [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Terminating instance#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.265 225709 DEBUG nova.compute.manager [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:21:00 np0005593294 kernel: tap5def28f3-3b (unregistering): left promiscuous mode
Jan 23 05:21:00 np0005593294 NetworkManager[48978]: <info>  [1769163660.3267] device (tap5def28f3-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:21:00 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:00Z|00053|binding|INFO|Releasing lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 from this chassis (sb_readonly=0)
Jan 23 05:21:00 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:00Z|00054|binding|INFO|Setting lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 down in Southbound
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.343 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:00Z|00055|binding|INFO|Removing iface tap5def28f3-3b ovn-installed in OVS
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.345 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.355 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a8:9e 10.100.0.8'], port_security=['fa:16:3e:d5:a8:9e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f52d23-9898-43a0-a951-b69cb2abebab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '077cd29b-8d1e-4ab1-b762-8cd58191c522', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30316727-f942-4d99-94ec-26d1184b5c8a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=5def28f3-3bf5-4f1f-8e37-51794dbddfc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.357 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 in datapath 13f52d23-9898-43a0-a951-b69cb2abebab unbound from our chassis#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.359 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13f52d23-9898-43a0-a951-b69cb2abebab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.360 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[27fe0fd6-a959-4371-b64e-50c3e31fd1f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.361 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab namespace which is not needed anymore#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.370 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 23 05:21:00 np0005593294 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 15.270s CPU time.
Jan 23 05:21:00 np0005593294 systemd-machined[194551]: Machine qemu-2-instance-00000004 terminated.
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.500 225709 INFO nova.virt.libvirt.driver [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance destroyed successfully.#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.500 225709 DEBUG nova.objects.instance [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:00 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : haproxy version is 2.8.14-c23fe91
Jan 23 05:21:00 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : path to executable is /usr/sbin/haproxy
Jan 23 05:21:00 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [WARNING]  (231195) : Exiting Master process...
Jan 23 05:21:00 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [WARNING]  (231195) : Exiting Master process...
Jan 23 05:21:00 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [ALERT]    (231195) : Current worker (231198) exited with code 143 (Terminated)
Jan 23 05:21:00 np0005593294 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [WARNING]  (231195) : All workers exited. Exiting... (0)
Jan 23 05:21:00 np0005593294 systemd[1]: libpod-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e.scope: Deactivated successfully.
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.520 225709 DEBUG nova.virt.libvirt.vif [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:19:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-263648847',display_name='tempest-TestNetworkBasicOps-server-263648847',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-263648847',id=4,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD66+yyfhusLfImf1uKQDIRs2V4/o1F8Eh/Z0SqxwU6ND9wcYC22x/WBjiGE3hyxp/MbA2OwCQVvqoeXerAOWd4tdGub7VFIxVXpqt6OghLL3nU7rH27QJ0mug8wzMNOTA==',key_name='tempest-TestNetworkBasicOps-501443179',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:19:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-rvp4p09d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:19:50Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.520 225709 DEBUG nova.network.os_vif_util [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:00 np0005593294 podman[233182]: 2026-01-23 10:21:00.521299888 +0000 UTC m=+0.052057956 container died 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.521 225709 DEBUG nova.network.os_vif_util [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.522 225709 DEBUG os_vif [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.524 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.524 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5def28f3-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.527 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.531 225709 INFO os_vif [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b')#033[00m
Jan 23 05:21:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:00 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e-userdata-shm.mount: Deactivated successfully.
Jan 23 05:21:00 np0005593294 systemd[1]: var-lib-containers-storage-overlay-200b86ca9cf8069eb92d9df0e5692d0577d96f4ecb60bc72d1a62929a70d50cc-merged.mount: Deactivated successfully.
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.559 225709 DEBUG nova.compute.manager [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-unplugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.560 225709 DEBUG oslo_concurrency.lockutils [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.560 225709 DEBUG oslo_concurrency.lockutils [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.560 225709 DEBUG oslo_concurrency.lockutils [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.561 225709 DEBUG nova.compute.manager [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] No waiting events found dispatching network-vif-unplugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.561 225709 DEBUG nova.compute.manager [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-unplugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:21:00 np0005593294 podman[233182]: 2026-01-23 10:21:00.565329159 +0000 UTC m=+0.096087227 container cleanup 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:21:00 np0005593294 systemd[1]: libpod-conmon-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e.scope: Deactivated successfully.
Jan 23 05:21:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:00.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:00 np0005593294 podman[233240]: 2026-01-23 10:21:00.628726264 +0000 UTC m=+0.042783073 container remove 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.634 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[14dbe623-67bb-422e-a20d-79807d6a3f87]: (4, ('Fri Jan 23 10:21:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab (4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e)\n4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e\nFri Jan 23 10:21:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab (4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e)\n4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.636 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[fbbef6ed-f6c7-4c36-8a2b-e829bc3c7b15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.637 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f52d23-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.638 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 kernel: tap13f52d23-90: left promiscuous mode
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.650 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.653 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3a334f-e4f8-4abb-8404-41ffc775ead1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.661 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:00 np0005593294 nova_compute[225705]: 2026-01-23 10:21:00.662 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.666 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[62384201-185a-43ef-8cd0-6ebaf080cd07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.666 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[60a288ee-9752-48a0-a52a-bce75fc509d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.686 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[798a3144-5d55-4f2b-81f8-dbcef46b1798]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474373, 'reachable_time': 34542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233257, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.688 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:21:00 np0005593294 systemd[1]: run-netns-ovnmeta\x2d13f52d23\x2d9898\x2d43a0\x2da951\x2db69cb2abebab.mount: Deactivated successfully.
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.688 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[48b40954-aaff-4177-807f-e325e7879b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.690 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.130 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.168 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.168 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00049c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102101 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:21:01 np0005593294 nova_compute[225705]: 2026-01-23 10:21:01.533 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:01.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.159 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.193 225709 INFO nova.virt.libvirt.driver [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deleting instance files /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_del#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.195 225709 INFO nova.virt.libvirt.driver [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deletion of /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_del complete#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.253 225709 INFO nova.compute.manager [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 1.99 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.254 225709 DEBUG oslo.service.loopingcall [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.254 225709 DEBUG nova.compute.manager [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.255 225709 DEBUG nova.network.neutron [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:21:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:02.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.641 225709 DEBUG nova.compute.manager [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.642 225709 DEBUG oslo_concurrency.lockutils [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.642 225709 DEBUG oslo_concurrency.lockutils [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.643 225709 DEBUG oslo_concurrency.lockutils [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.643 225709 DEBUG nova.compute.manager [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] No waiting events found dispatching network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.644 225709 WARNING nova.compute.manager [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received unexpected event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.679 225709 DEBUG nova.network.neutron [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.694 225709 INFO nova.compute.manager [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 0.44 seconds to deallocate network for instance.#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.735 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.736 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.740 225709 DEBUG nova.compute.manager [req-a3079a84-5d7e-43be-b418-60a417efb7f0 req-07dec829-c93f-4e3f-8157-f2ee5df7755e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-deleted-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:02 np0005593294 nova_compute[225705]: 2026-01-23 10:21:02.785 225709 DEBUG oslo_concurrency.processutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:21:03 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/215747545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:03 np0005593294 nova_compute[225705]: 2026-01-23 10:21:03.261 225709 DEBUG oslo_concurrency.processutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:03 np0005593294 nova_compute[225705]: 2026-01-23 10:21:03.268 225709 DEBUG nova.compute.provider_tree [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:21:03 np0005593294 nova_compute[225705]: 2026-01-23 10:21:03.393 225709 DEBUG nova.scheduler.client.report [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:21:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:03 np0005593294 nova_compute[225705]: 2026-01-23 10:21:03.757 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:03 np0005593294 nova_compute[225705]: 2026-01-23 10:21:03.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:03 np0005593294 nova_compute[225705]: 2026-01-23 10:21:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:03.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:04 np0005593294 nova_compute[225705]: 2026-01-23 10:21:04.275 225709 INFO nova.scheduler.client.report [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334#033[00m
Jan 23 05:21:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00049e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:04 np0005593294 podman[233307]: 2026-01-23 10:21:04.567305363 +0000 UTC m=+0.139027226 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:21:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:04.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:04 np0005593294 nova_compute[225705]: 2026-01-23 10:21:04.750 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:05 np0005593294 nova_compute[225705]: 2026-01-23 10:21:05.557 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:05 np0005593294 nova_compute[225705]: 2026-01-23 10:21:05.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:06.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.535 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004160 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:06.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.868 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.882 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.883 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.883 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.901 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:21:06 np0005593294 nova_compute[225705]: 2026-01-23 10:21:06.902 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040028e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:21:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3840207814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.374 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.585 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.587 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4880MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.587 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.587 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.682 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.682 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:21:07 np0005593294 nova_compute[225705]: 2026-01-23 10:21:07.740 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:21:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:08.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:21:08 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3507747902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:08 np0005593294 nova_compute[225705]: 2026-01-23 10:21:08.195 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:21:08 np0005593294 nova_compute[225705]: 2026-01-23 10:21:08.203 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:21:08 np0005593294 nova_compute[225705]: 2026-01-23 10:21:08.219 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:21:08 np0005593294 nova_compute[225705]: 2026-01-23 10:21:08.242 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:21:08 np0005593294 nova_compute[225705]: 2026-01-23 10:21:08.242 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:08.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:09 np0005593294 nova_compute[225705]: 2026-01-23 10:21:09.235 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:09 np0005593294 nova_compute[225705]: 2026-01-23 10:21:09.236 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:21:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:10.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:10 np0005593294 nova_compute[225705]: 2026-01-23 10:21:10.560 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:10.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:10 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:10.691 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:21:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:11 np0005593294 nova_compute[225705]: 2026-01-23 10:21:11.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:11 np0005593294 nova_compute[225705]: 2026-01-23 10:21:11.587 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:11 np0005593294 nova_compute[225705]: 2026-01-23 10:21:11.677 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:12.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:21:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:21:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:12.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:14.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:14.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:15 np0005593294 nova_compute[225705]: 2026-01-23 10:21:15.498 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163660.4970114, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:21:15 np0005593294 nova_compute[225705]: 2026-01-23 10:21:15.499 225709 INFO nova.compute.manager [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Stopped (Lifecycle Event)
Jan 23 05:21:15 np0005593294 nova_compute[225705]: 2026-01-23 10:21:15.521 225709 DEBUG nova.compute.manager [None req-3d57e49d-c759-4808-b57a-b5615e9d1214 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:21:15 np0005593294 nova_compute[225705]: 2026-01-23 10:21:15.564 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:21:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:16.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:16 np0005593294 nova_compute[225705]: 2026-01-23 10:21:16.542 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:16.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:18.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:18.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:18 np0005593294 podman[233389]: 2026-01-23 10:21:18.669434094 +0000 UTC m=+0.074705493 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:21:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:20 np0005593294 nova_compute[225705]: 2026-01-23 10:21:20.609 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:20.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:20 np0005593294 nova_compute[225705]: 2026-01-23 10:21:20.775 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102121 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:21:21 np0005593294 nova_compute[225705]: 2026-01-23 10:21:21.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:22.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:22.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:24.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:24.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:25 np0005593294 nova_compute[225705]: 2026-01-23 10:21:25.614 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:26 np0005593294 nova_compute[225705]: 2026-01-23 10:21:26.547 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:26.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:28.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:28.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001670 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:30.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040011d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:30 np0005593294 nova_compute[225705]: 2026-01-23 10:21:30.616 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:30.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:31 np0005593294 nova_compute[225705]: 2026-01-23 10:21:31.549 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.140 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.141 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.164 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.235 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.236 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.248 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.248 225709 INFO nova.compute.claims [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.372 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001670 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040011d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:32.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:21:32 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3733854137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.877 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:32 np0005593294 nova_compute[225705]: 2026-01-23 10:21:32.884 225709 DEBUG nova.compute.provider_tree [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.099 225709 DEBUG nova.scheduler.client.report [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.127 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.891s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.128 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.175 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.176 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.197 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.215 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:21:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.297 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.299 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.299 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Creating image(s)#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.332 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.366 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.396 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.402 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.490 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.492 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.493 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.494 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.525 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.532 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:33 np0005593294 nova_compute[225705]: 2026-01-23 10:21:33.561 225709 DEBUG nova.policy [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:21:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001670 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:34.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:35 np0005593294 nova_compute[225705]: 2026-01-23 10:21:35.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:35 np0005593294 podman[233640]: 2026-01-23 10:21:35.715745761 +0000 UTC m=+0.112697263 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:21:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:36 np0005593294 nova_compute[225705]: 2026-01-23 10:21:36.551 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:36 np0005593294 nova_compute[225705]: 2026-01-23 10:21:36.620 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully created port: dfaa68a5-31a2-4de5-996e-11936357ca9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:21:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:36.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001810 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:38.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.222 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.691s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.293 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.498 225709 DEBUG nova.objects.instance [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.512 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.513 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Ensure instance console log exists: /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.513 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.513 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.514 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:38.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.727 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully updated port: dfaa68a5-31a2-4de5-996e-11936357ca9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.756 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.757 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.757 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:21:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:38 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.889 225709 DEBUG nova.compute.manager [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.890 225709 DEBUG nova.compute.manager [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:21:38 np0005593294 nova_compute[225705]: 2026-01-23 10:21:38.891 225709 DEBUG oslo_concurrency.lockutils [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:21:39 np0005593294 nova_compute[225705]: 2026-01-23 10:21:39.124 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:21:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:21:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:40 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.367 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.399 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.400 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance network_info: |[{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.400 225709 DEBUG oslo_concurrency.lockutils [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.401 225709 DEBUG nova.network.neutron [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.405 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start _get_guest_xml network_info=[{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.411 225709 WARNING nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.421 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.422 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.425 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.426 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.426 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.426 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.427 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.427 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.427 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.429 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.429 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.431 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.622 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:40.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:21:40 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1204880216' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.895 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.930 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:40 np0005593294 nova_compute[225705]: 2026-01-23 10:21:40.936 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4002eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:41 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:21:41 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1226134693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.421 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.423 225709 DEBUG nova.virt.libvirt.vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:21:33Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.423 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.424 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.425 225709 DEBUG nova.objects.instance [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.441 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <name>instance-00000006</name>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <memory>131072</memory>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <vcpu>1</vcpu>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:creationTime>2026-01-23 10:21:40</nova:creationTime>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:flavor name="m1.nano">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:memory>128</nova:memory>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:disk>1</nova:disk>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:swap>0</nova:swap>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </nova:flavor>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:owner>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </nova:owner>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <nova:ports>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        </nova:port>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </nova:ports>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </nova:instance>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <sysinfo type="smbios">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <entry name="serial">db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <entry name="uuid">db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <boot dev="hd"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <smbios mode="sysinfo"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <vmcoreinfo/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <clock offset="utc">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <timer name="hpet" present="no"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <cpu mode="host-model" match="exact">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <disk type="network" device="disk">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <target dev="vda" bus="virtio"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <disk type="network" device="cdrom">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <target dev="sda" bus="sata"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <interface type="ethernet">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <mac address="fa:16:3e:b7:90:a0"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <mtu size="1442"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <target dev="tapdfaa68a5-31"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <serial type="pty">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <log file="/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log" append="off"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <input type="tablet" bus="usb"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <rng model="virtio">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <controller type="usb" index="0"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    <memballoon model="virtio">
Jan 23 05:21:41 np0005593294 nova_compute[225705]:      <stats period="10"/>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:21:41 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:21:41 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:21:41 np0005593294 nova_compute[225705]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.443 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Preparing to wait for external event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.443 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.443 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.444 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.444 225709 DEBUG nova.virt.libvirt.vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:21:33Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.444 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.445 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.445 225709 DEBUG os_vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.446 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.447 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.447 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.452 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.452 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfaa68a5-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.453 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdfaa68a5-31, col_values=(('external_ids', {'iface-id': 'dfaa68a5-31a2-4de5-996e-11936357ca9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:90:a0', 'vm-uuid': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:41 np0005593294 NetworkManager[48978]: <info>  [1769163701.4973] manager: (tapdfaa68a5-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.499 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.508 225709 INFO os_vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31')#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.552 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.579 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.580 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.580 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:b7:90:a0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.581 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Using config drive#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.613 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.815 225709 DEBUG nova.network.neutron [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.816 225709 DEBUG nova.network.neutron [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.842 225709 DEBUG oslo_concurrency.lockutils [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.966 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Creating config drive at /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config#033[00m
Jan 23 05:21:41 np0005593294 nova_compute[225705]: 2026-01-23 10:21:41.973 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcuy4rb25 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:42.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:42 np0005593294 nova_compute[225705]: 2026-01-23 10:21:42.115 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcuy4rb25" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:42 np0005593294 nova_compute[225705]: 2026-01-23 10:21:42.163 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:21:42 np0005593294 nova_compute[225705]: 2026-01-23 10:21:42.167 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:42.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.292 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.294 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deleting local config drive /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config because it was imported into RBD.#033[00m
Jan 23 05:21:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:43 np0005593294 kernel: tapdfaa68a5-31: entered promiscuous mode
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.354 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:43Z|00056|binding|INFO|Claiming lport dfaa68a5-31a2-4de5-996e-11936357ca9b for this chassis.
Jan 23 05:21:43 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:43Z|00057|binding|INFO|dfaa68a5-31a2-4de5-996e-11936357ca9b: Claiming fa:16:3e:b7:90:a0 10.100.0.11
Jan 23 05:21:43 np0005593294 NetworkManager[48978]: <info>  [1769163703.3576] manager: (tapdfaa68a5-31): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.360 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.362 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.380 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:90:a0 10.100.0.11'], port_security=['fa:16:3e:b7:90:a0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8afa87f8-5b22-4350-8bf2-c7af019c3372', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dbc1781-4648-4570-b3c6-0353674ab246, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=dfaa68a5-31a2-4de5-996e-11936357ca9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.382 143098 INFO neutron.agent.ovn.metadata.agent [-] Port dfaa68a5-31a2-4de5-996e-11936357ca9b in datapath eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 bound to our chassis#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.383 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eae9e618-a7c2-43e9-ab46-9070ca2ef7f2#033[00m
Jan 23 05:21:43 np0005593294 systemd-udevd[233875]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:21:43 np0005593294 systemd-machined[194551]: New machine qemu-3-instance-00000006.
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.405 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3c00033c-26f9-4652-8664-d33119a758c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.409 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeae9e618-a1 in ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:21:43 np0005593294 NetworkManager[48978]: <info>  [1769163703.4137] device (tapdfaa68a5-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.414 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeae9e618-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.414 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[99a09d0a-c8e2-486a-b180-33c3213e5799]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 NetworkManager[48978]: <info>  [1769163703.4151] device (tapdfaa68a5-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.415 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e580114e-78e5-40e7-a2ec-b8077166c36d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.425 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:43Z|00058|binding|INFO|Setting lport dfaa68a5-31a2-4de5-996e-11936357ca9b ovn-installed in OVS
Jan 23 05:21:43 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:43Z|00059|binding|INFO|Setting lport dfaa68a5-31a2-4de5-996e-11936357ca9b up in Southbound
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.430 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.430 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[592bc69a-2f50-46e3-9902-7d3654d8c842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.444 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[cb87e7b0-fed1-4a9c-9439-8e72a659474b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.470 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[6da9db94-9cc8-4632-9a79-b8c02ecefa30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.475 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6ecdcd-ac60-4e49-9457-0314817d6176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 NetworkManager[48978]: <info>  [1769163703.4764] manager: (tapeae9e618-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.508 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[4b295205-bf2e-481e-a525-a70f7f83e392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.510 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[9c679fba-caa9-438f-b3b4-b76d780856f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 NetworkManager[48978]: <info>  [1769163703.5329] device (tapeae9e618-a0): carrier: link connected
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.538 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[3aab1e74-4f5e-4b4b-9bd1-702cfa636b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.558 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[183d3b62-d34e-4c3b-b334-cae7766024eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeae9e618-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485751, 'reachable_time': 35285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233910, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.575 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e171f94a-5372-4239-91c9-cc966e0512fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:bca2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485751, 'tstamp': 485751}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233911, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.594 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2e62c60b-001e-4044-97a4-3c186aa51f28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeae9e618-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485751, 'reachable_time': 35285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233912, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.622 225709 DEBUG nova.compute.manager [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG oslo_concurrency.lockutils [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG oslo_concurrency.lockutils [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG oslo_concurrency.lockutils [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG nova.compute.manager [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Processing event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.624 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[391a2213-0d07-4662-93bb-afd44685f18a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.679 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2d609e-27e8-42bc-96ca-a283c4bec2cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.681 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeae9e618-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.681 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.682 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeae9e618-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:43 np0005593294 kernel: tapeae9e618-a0: entered promiscuous mode
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.683 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 NetworkManager[48978]: <info>  [1769163703.6854] manager: (tapeae9e618-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.686 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.687 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeae9e618-a0, col_values=(('external_ids', {'iface-id': 'f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:43 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:43Z|00060|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.688 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 nova_compute[225705]: 2026-01-23 10:21:43.701 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.702 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.703 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[b6671725-223a-4350-8c25-a8d50d3c5e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.704 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.pid.haproxy
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID eae9e618-a7c2-43e9-ab46-9070ca2ef7f2
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:21:43 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.706 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'env', 'PROCESS_TAG=haproxy-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.002 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.002 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163704.0014431, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.002 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Started (Lifecycle Event)#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.008 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.012 225709 INFO nova.virt.libvirt.driver [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance spawned successfully.#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.013 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.022 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.025 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.033 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.034 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.034 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.034 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.035 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.035 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.042 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.043 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163704.0043259, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.043 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:21:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:44.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.065 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.069 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163704.0071929, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.070 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.097 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.102 225709 INFO nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 10.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.103 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.104 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.136 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.165 225709 INFO nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 11.96 seconds to build instance.#033[00m
Jan 23 05:21:44 np0005593294 nova_compute[225705]: 2026-01-23 10:21:44.183 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:44 np0005593294 podman[233985]: 2026-01-23 10:21:44.090848139 +0000 UTC m=+0.028274445 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:21:44 np0005593294 podman[233985]: 2026-01-23 10:21:44.437090894 +0000 UTC m=+0.374517170 container create 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:21:44 np0005593294 systemd[1]: Started libpod-conmon-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee.scope.
Jan 23 05:21:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4002eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:44 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:21:44 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d908cc39ee3158e23a819479dadd5bc8e191abccd206682925f5884fa34301/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:21:44 np0005593294 podman[233985]: 2026-01-23 10:21:44.621410251 +0000 UTC m=+0.558836557 container init 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:21:44 np0005593294 podman[233985]: 2026-01-23 10:21:44.626380558 +0000 UTC m=+0.563806834 container start 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:21:44 np0005593294 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : New worker (234056) forked
Jan 23 05:21:44 np0005593294 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : Loading success.
Jan 23 05:21:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:44.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:45 np0005593294 nova_compute[225705]: 2026-01-23 10:21:45.803 225709 DEBUG nova.compute.manager [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:45 np0005593294 nova_compute[225705]: 2026-01-23 10:21:45.804 225709 DEBUG oslo_concurrency.lockutils [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:45 np0005593294 nova_compute[225705]: 2026-01-23 10:21:45.804 225709 DEBUG oslo_concurrency.lockutils [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:45 np0005593294 nova_compute[225705]: 2026-01-23 10:21:45.804 225709 DEBUG oslo_concurrency.lockutils [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:45 np0005593294 nova_compute[225705]: 2026-01-23 10:21:45.805 225709 DEBUG nova.compute.manager [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:45 np0005593294 nova_compute[225705]: 2026-01-23 10:21:45.805 225709 WARNING nova.compute.manager [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b for instance with vm_state active and task_state None.#033[00m
Jan 23 05:21:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:46.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:46 np0005593294 nova_compute[225705]: 2026-01-23 10:21:46.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:46 np0005593294 nova_compute[225705]: 2026-01-23 10:21:46.556 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.581883) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706582073, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2403, "num_deletes": 251, "total_data_size": 6597390, "memory_usage": 6707840, "flush_reason": "Manual Compaction"}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 23 05:21:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706627096, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4220070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25834, "largest_seqno": 28232, "table_properties": {"data_size": 4210435, "index_size": 6065, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20450, "raw_average_key_size": 20, "raw_value_size": 4190923, "raw_average_value_size": 4207, "num_data_blocks": 262, "num_entries": 996, "num_filter_entries": 996, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163491, "oldest_key_time": 1769163491, "file_creation_time": 1769163706, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 45255 microseconds, and 9363 cpu microseconds.
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:21:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.627187) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4220070 bytes OK
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.627224) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.666112) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.666190) EVENT_LOG_v1 {"time_micros": 1769163706666177, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.666231) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6586774, prev total WAL file size 6587481, number of live WAL files 2.
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.668478) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4121KB)], [51(12MB)]
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706668684, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17044787, "oldest_snapshot_seqno": -1}
Jan 23 05:21:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5925 keys, 14852407 bytes, temperature: kUnknown
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706914657, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14852407, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14811516, "index_size": 24973, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14853, "raw_key_size": 150611, "raw_average_key_size": 25, "raw_value_size": 14703001, "raw_average_value_size": 2481, "num_data_blocks": 1017, "num_entries": 5925, "num_filter_entries": 5925, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163706, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.914996) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14852407 bytes
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.916684) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.3 rd, 60.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.2 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(7.6) write-amplify(3.5) OK, records in: 6448, records dropped: 523 output_compression: NoCompression
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.916700) EVENT_LOG_v1 {"time_micros": 1769163706916692, "job": 30, "event": "compaction_finished", "compaction_time_micros": 246034, "compaction_time_cpu_micros": 34322, "output_level": 6, "num_output_files": 1, "total_output_size": 14852407, "num_input_records": 6448, "num_output_records": 5925, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706917588, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706920361, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.668120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920528) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:46 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:47 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:47Z|00061|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 05:21:47 np0005593294 nova_compute[225705]: 2026-01-23 10:21:47.117 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:47 np0005593294 NetworkManager[48978]: <info>  [1769163707.1180] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 05:21:47 np0005593294 NetworkManager[48978]: <info>  [1769163707.1191] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 05:21:47 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:47Z|00062|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 05:21:47 np0005593294 nova_compute[225705]: 2026-01-23 10:21:47.174 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:47 np0005593294 nova_compute[225705]: 2026-01-23 10:21:47.179 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:48.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:48 np0005593294 nova_compute[225705]: 2026-01-23 10:21:48.753 225709 DEBUG nova.compute.manager [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:48 np0005593294 nova_compute[225705]: 2026-01-23 10:21:48.754 225709 DEBUG nova.compute.manager [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:21:48 np0005593294 nova_compute[225705]: 2026-01-23 10:21:48.754 225709 DEBUG oslo_concurrency.lockutils [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:21:48 np0005593294 nova_compute[225705]: 2026-01-23 10:21:48.755 225709 DEBUG oslo_concurrency.lockutils [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:48 np0005593294 nova_compute[225705]: 2026-01-23 10:21:48.755 225709 DEBUG nova.network.neutron [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:21:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:49 np0005593294 podman[234069]: 2026-01-23 10:21:49.677621769 +0000 UTC m=+0.073611758 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:21:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:50.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:50.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:51 np0005593294 nova_compute[225705]: 2026-01-23 10:21:51.049 225709 DEBUG nova.network.neutron [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:21:51 np0005593294 nova_compute[225705]: 2026-01-23 10:21:51.050 225709 DEBUG nova.network.neutron [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:51 np0005593294 nova_compute[225705]: 2026-01-23 10:21:51.067 225709 DEBUG oslo_concurrency.lockutils [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:51 np0005593294 nova_compute[225705]: 2026-01-23 10:21:51.542 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:51 np0005593294 nova_compute[225705]: 2026-01-23 10:21:51.557 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:52.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:52.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:54.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:54.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:55.049 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:55.050 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:21:55.051 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:56 np0005593294 nova_compute[225705]: 2026-01-23 10:21:56.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:56 np0005593294 nova_compute[225705]: 2026-01-23 10:21:56.558 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:56.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:57 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:57Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:90:a0 10.100.0.11
Jan 23 05:21:57 np0005593294 ovn_controller[133293]: 2026-01-23T10:21:57Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:90:a0 10.100.0.11
Jan 23 05:21:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:58.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102158 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:21:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:21:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:58.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:00.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.559 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.561 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.562 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.562 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.575 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.576 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.947 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.947 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:22:01 np0005593294 nova_compute[225705]: 2026-01-23 10:22:01.947 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:22:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:02.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:02 np0005593294 nova_compute[225705]: 2026-01-23 10:22:02.485 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:02 np0005593294 nova_compute[225705]: 2026-01-23 10:22:02.485 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:02 np0005593294 nova_compute[225705]: 2026-01-23 10:22:02.485 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:22:02 np0005593294 nova_compute[225705]: 2026-01-23 10:22:02.486 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:02.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:03 np0005593294 nova_compute[225705]: 2026-01-23 10:22:03.271 225709 INFO nova.compute.manager [None req-c635b348-c6e2-40be-9940-9413dfd2cffe f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Get console output#033[00m
Jan 23 05:22:03 np0005593294 nova_compute[225705]: 2026-01-23 10:22:03.278 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:22:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:04.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:04 np0005593294 nova_compute[225705]: 2026-01-23 10:22:04.094 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:04 np0005593294 nova_compute[225705]: 2026-01-23 10:22:04.117 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:04 np0005593294 nova_compute[225705]: 2026-01-23 10:22:04.118 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:22:04 np0005593294 nova_compute[225705]: 2026-01-23 10:22:04.118 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:04.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:04 np0005593294 nova_compute[225705]: 2026-01-23 10:22:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:04 np0005593294 nova_compute[225705]: 2026-01-23 10:22:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:06.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.577 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.580 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.581 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:06.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:06 np0005593294 podman[234123]: 2026-01-23 10:22:06.740717956 +0000 UTC m=+0.126096947 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.899 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.900 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:22:06 np0005593294 nova_compute[225705]: 2026-01-23 10:22:06.901 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.000 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.001 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.001 225709 DEBUG nova.objects.instance [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.279 225709 DEBUG nova.objects.instance [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_requests' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:22:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/748902986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.363 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.893 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.971 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:07 np0005593294 nova_compute[225705]: 2026-01-23 10:22:07.972 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:08.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.177 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.178 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4699MB free_disk=59.942752838134766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.178 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.179 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.267 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.268 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.268 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.319 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.541 225709 DEBUG nova.policy [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:22:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:08.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:22:08 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1277093325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.800 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.807 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.831 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.882 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:22:08 np0005593294 nova_compute[225705]: 2026-01-23 10:22:08.882 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:09 np0005593294 nova_compute[225705]: 2026-01-23 10:22:09.207 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully created port: d372b816-e400-40ea-9e6b-cc8c21e54bc6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:22:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:09 np0005593294 nova_compute[225705]: 2026-01-23 10:22:09.883 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593294 nova_compute[225705]: 2026-01-23 10:22:09.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593294 nova_compute[225705]: 2026-01-23 10:22:09.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593294 nova_compute[225705]: 2026-01-23 10:22:09.885 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593294 nova_compute[225705]: 2026-01-23 10:22:09.886 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.031 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully updated port: d372b816-e400-40ea-9e6b-cc8c21e54bc6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.050 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.051 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.051 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:22:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:10.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.145 225709 DEBUG nova.compute.manager [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.146 225709 DEBUG nova.compute.manager [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:10 np0005593294 nova_compute[225705]: 2026-01-23 10:22:10.147 225709 DEBUG oslo_concurrency.lockutils [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:22:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:22:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:10.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:11 np0005593294 nova_compute[225705]: 2026-01-23 10:22:11.581 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:12.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.562 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.582 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.583 225709 DEBUG oslo_concurrency.lockutils [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.584 225709 DEBUG nova.network.neutron [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.590 225709 DEBUG nova.virt.libvirt.vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.591 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.592 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.593 225709 DEBUG os_vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.594 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.595 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.596 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.605 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.605 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd372b816-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.606 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd372b816-e4, col_values=(('external_ids', {'iface-id': 'd372b816-e400-40ea-9e6b-cc8c21e54bc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:7d:ea', 'vm-uuid': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.609 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 NetworkManager[48978]: <info>  [1769163732.6109] manager: (tapd372b816-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 23 05:22:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.613 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.623 225709 INFO os_vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4')#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.625 225709 DEBUG nova.virt.libvirt.vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.625 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.627 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.630 225709 DEBUG nova.virt.libvirt.guest [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] attach device xml: <interface type="ethernet">
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <mac address="fa:16:3e:02:7d:ea"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <model type="virtio"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <mtu size="1442"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <target dev="tapd372b816-e4"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]: </interface>
Jan 23 05:22:12 np0005593294 nova_compute[225705]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:22:12 np0005593294 kernel: tapd372b816-e4: entered promiscuous mode
Jan 23 05:22:12 np0005593294 NetworkManager[48978]: <info>  [1769163732.6487] manager: (tapd372b816-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.650 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:12Z|00063|binding|INFO|Claiming lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 for this chassis.
Jan 23 05:22:12 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:12Z|00064|binding|INFO|d372b816-e400-40ea-9e6b-cc8c21e54bc6: Claiming fa:16:3e:02:7d:ea 10.100.0.20
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.663 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:7d:ea 10.100.0.20'], port_security=['fa:16:3e:02:7d:ea 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=d372b816-e400-40ea-9e6b-cc8c21e54bc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.664 143098 INFO neutron.agent.ovn.metadata.agent [-] Port d372b816-e400-40ea-9e6b-cc8c21e54bc6 in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 bound to our chassis#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.666 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a09a282-aa22-47cf-a68d-ce0dba493868#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.682 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2987cd-91f3-48d3-bc10-886d8aa2f72a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.684 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a09a282-a1 in ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.685 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a09a282-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.686 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[db07c222-d1a2-449a-849d-3ef7aa92bd80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.687 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c2064f7f-fc77-4a6a-8950-322230437231]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002af0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.695 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:12Z|00065|binding|INFO|Setting lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 ovn-installed in OVS
Jan 23 05:22:12 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:12Z|00066|binding|INFO|Setting lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 up in Southbound
Jan 23 05:22:12 np0005593294 systemd-udevd[234206]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.706 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.705 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[34d01c08-8f95-4df9-8928-dc76510a2a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 NetworkManager[48978]: <info>  [1769163732.7301] device (tapd372b816-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:22:12 np0005593294 NetworkManager[48978]: <info>  [1769163732.7308] device (tapd372b816-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.734 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ff1263-c9db-47db-8f21-c9ec94a08799]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:12.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.764 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.765 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.765 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:b7:90:a0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.766 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:02:7d:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.770 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[e820c2e7-dc0e-4068-8ade-795becf7a924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 NetworkManager[48978]: <info>  [1769163732.7769] manager: (tap6a09a282-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.776 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2c722a-99aa-4559-9a65-98b4d2f8b60d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.795 225709 DEBUG nova.virt.libvirt.guest [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:22:12</nova:creationTime>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:22:12 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    <nova:port uuid="d372b816-e400-40ea-9e6b-cc8c21e54bc6">
Jan 23 05:22:12 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:22:12 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:22:12 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:22:12 np0005593294 nova_compute[225705]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.819 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c135aba3-16ca-4414-9431-35c263a34d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.822 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[d063cf37-dfad-4c10-989e-c4afbbc728cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 nova_compute[225705]: 2026-01-23 10:22:12.826 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:12 np0005593294 NetworkManager[48978]: <info>  [1769163732.8486] device (tap6a09a282-a0): carrier: link connected
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.853 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf0ad97-7666-4eef-8272-f06c832404ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.871 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[85df82a3-f20b-41f7-ab4f-1fab18b4116a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488683, 'reachable_time': 26349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234231, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.894 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[231545f6-db11-436c-b63b-90daae1d3198]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:9ba3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488683, 'tstamp': 488683}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234232, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.919 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9c42c9-107e-41d1-8310-f499ee3f10fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488683, 'reachable_time': 26349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234233, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:12 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.961 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ab509599-f2e0-4648-b061-710b6b68f5f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.040 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ba85d32a-5df3-4dc0-91c7-e08c6dd76f01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.043 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.043 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.044 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a09a282-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:13 np0005593294 NetworkManager[48978]: <info>  [1769163733.0480] manager: (tap6a09a282-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 23 05:22:13 np0005593294 kernel: tap6a09a282-a0: entered promiscuous mode
Jan 23 05:22:13 np0005593294 nova_compute[225705]: 2026-01-23 10:22:13.047 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593294 nova_compute[225705]: 2026-01-23 10:22:13.051 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.052 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a09a282-a0, col_values=(('external_ids', {'iface-id': 'f3eaa8c6-94ad-445d-ab48-59e26f30c078'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:13 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:13Z|00067|binding|INFO|Releasing lport f3eaa8c6-94ad-445d-ab48-59e26f30c078 from this chassis (sb_readonly=0)
Jan 23 05:22:13 np0005593294 nova_compute[225705]: 2026-01-23 10:22:13.054 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593294 nova_compute[225705]: 2026-01-23 10:22:13.080 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593294 nova_compute[225705]: 2026-01-23 10:22:13.084 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.085 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.086 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7097759a-b434-4c3e-89ba-bc5122c140d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.088 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID 6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:22:13 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.089 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'env', 'PROCESS_TAG=haproxy-6a09a282-aa22-47cf-a68d-ce0dba493868', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a09a282-aa22-47cf-a68d-ce0dba493868.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:22:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:13 np0005593294 podman[234266]: 2026-01-23 10:22:13.519780464 +0000 UTC m=+0.067680610 container create 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:22:13 np0005593294 systemd[1]: Started libpod-conmon-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315.scope.
Jan 23 05:22:13 np0005593294 podman[234266]: 2026-01-23 10:22:13.48390461 +0000 UTC m=+0.031804846 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:22:13 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:22:13 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44d1fb66475eda2601ea35994a86f5afc04dbdd9f74d7699ab0867a140ff5c5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:13 np0005593294 podman[234266]: 2026-01-23 10:22:13.636313288 +0000 UTC m=+0.184213454 container init 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:22:13 np0005593294 podman[234266]: 2026-01-23 10:22:13.646240332 +0000 UTC m=+0.194140518 container start 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:22:13 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : New worker (234288) forked
Jan 23 05:22:13 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : Loading success.
Jan 23 05:22:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:22:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:14 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:14Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:7d:ea 10.100.0.20
Jan 23 05:22:14 np0005593294 ovn_controller[133293]: 2026-01-23T10:22:14Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:7d:ea 10.100.0.20
Jan 23 05:22:14 np0005593294 nova_compute[225705]: 2026-01-23 10:22:14.602 225709 DEBUG nova.compute.manager [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:14 np0005593294 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG oslo_concurrency.lockutils [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:14 np0005593294 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG oslo_concurrency.lockutils [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:14 np0005593294 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG oslo_concurrency.lockutils [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:14 np0005593294 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG nova.compute.manager [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:14 np0005593294 nova_compute[225705]: 2026-01-23 10:22:14.604 225709 WARNING nova.compute.manager [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:14.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002af0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:16.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:16 np0005593294 nova_compute[225705]: 2026-01-23 10:22:16.296 225709 DEBUG nova.network.neutron [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:16 np0005593294 nova_compute[225705]: 2026-01-23 10:22:16.297 225709 DEBUG nova.network.neutron [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:16 np0005593294 nova_compute[225705]: 2026-01-23 10:22:16.311 225709 DEBUG oslo_concurrency.lockutils [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:16 np0005593294 nova_compute[225705]: 2026-01-23 10:22:16.584 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:16.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.024 225709 DEBUG nova.compute.manager [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.024 225709 DEBUG oslo_concurrency.lockutils [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 DEBUG oslo_concurrency.lockutils [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 DEBUG oslo_concurrency.lockutils [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 DEBUG nova.compute.manager [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 WARNING nova.compute.manager [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:17 np0005593294 nova_compute[225705]: 2026-01-23 10:22:17.610 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:18.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102220 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:22:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:20 np0005593294 podman[234300]: 2026-01-23 10:22:20.663772039 +0000 UTC m=+0.063431756 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:22:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:20.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:21 np0005593294 nova_compute[225705]: 2026-01-23 10:22:21.586 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:22.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:22 np0005593294 nova_compute[225705]: 2026-01-23 10:22:22.613 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:22.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:24.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:24 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:24.263 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:24 np0005593294 nova_compute[225705]: 2026-01-23 10:22:24.264 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:24 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:24.265 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:22:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:24.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:26.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:26 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:26.268 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:26 np0005593294 nova_compute[225705]: 2026-01-23 10:22:26.589 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:26.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:27 np0005593294 nova_compute[225705]: 2026-01-23 10:22:27.615 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:28.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:28 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:28.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:30.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:30 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:30.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:31 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:31 np0005593294 nova_compute[225705]: 2026-01-23 10:22:31.592 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:32.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:32 np0005593294 nova_compute[225705]: 2026-01-23 10:22:32.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:32.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:34.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:34 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:36.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:36 np0005593294 nova_compute[225705]: 2026-01-23 10:22:36.595 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:36 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:37 np0005593294 nova_compute[225705]: 2026-01-23 10:22:37.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:37 np0005593294 podman[234356]: 2026-01-23 10:22:37.693667439 +0000 UTC m=+0.096563583 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:22:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:38.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:38 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:38.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:40.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:40 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:41 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:41 np0005593294 nova_compute[225705]: 2026-01-23 10:22:41.598 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:42.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:42 np0005593294 nova_compute[225705]: 2026-01-23 10:22:42.624 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:42.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:43 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:44.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:44 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:22:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:22:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:46.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:46 np0005593294 nova_compute[225705]: 2026-01-23 10:22:46.603 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:46 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:46.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:47 np0005593294 nova_compute[225705]: 2026-01-23 10:22:47.626 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:48.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:48 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:48.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:49 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:49 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:50.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102250 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:22:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:50 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:50.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:51 np0005593294 nova_compute[225705]: 2026-01-23 10:22:51.605 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:51 np0005593294 podman[234523]: 2026-01-23 10:22:51.656116578 +0000 UTC m=+0.053381828 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:22:51 np0005593294 nova_compute[225705]: 2026-01-23 10:22:51.967 225709 DEBUG nova.compute.manager [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:51 np0005593294 nova_compute[225705]: 2026-01-23 10:22:51.968 225709 DEBUG nova.compute.manager [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:51 np0005593294 nova_compute[225705]: 2026-01-23 10:22:51.969 225709 DEBUG oslo_concurrency.lockutils [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:51 np0005593294 nova_compute[225705]: 2026-01-23 10:22:51.969 225709 DEBUG oslo_concurrency.lockutils [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:51 np0005593294 nova_compute[225705]: 2026-01-23 10:22:51.969 225709 DEBUG nova.network.neutron [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:52.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:52 np0005593294 nova_compute[225705]: 2026-01-23 10:22:52.629 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:52.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:53 np0005593294 nova_compute[225705]: 2026-01-23 10:22:53.691 225709 DEBUG nova.network.neutron [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:53 np0005593294 nova_compute[225705]: 2026-01-23 10:22:53.691 225709 DEBUG nova.network.neutron [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:53 np0005593294 nova_compute[225705]: 2026-01-23 10:22:53.713 225709 DEBUG oslo_concurrency.lockutils [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:54.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:54.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:55.051 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:55.052 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:22:55.053 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:56.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:56 np0005593294 nova_compute[225705]: 2026-01-23 10:22:56.607 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:56.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:57 np0005593294 nova_compute[225705]: 2026-01-23 10:22:57.633 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:58.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:22:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:58.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:23:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:00.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:00.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:01 np0005593294 nova_compute[225705]: 2026-01-23 10:23:01.609 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.310 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-d372b816-e400-40ea-9e6b-cc8c21e54bc6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.311 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-d372b816-e400-40ea-9e6b-cc8c21e54bc6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.328 225709 DEBUG nova.objects.instance [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.349 225709 DEBUG nova.virt.libvirt.vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.350 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.350 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.355 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.359 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.362 225709 DEBUG nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Attempting to detach device tapd372b816-e4 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.363 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <mac address="fa:16:3e:02:7d:ea"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <model type="virtio"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <mtu size="1442"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <target dev="tapd372b816-e4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </interface>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.370 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.374 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <name>instance-00000006</name>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:22:12</nova:creationTime>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:port uuid="d372b816-e400-40ea-9e6b-cc8c21e54bc6">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='tapdfaa68a5-31'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:02:7d:ea'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='tapd372b816-e4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='net1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.375 225709 INFO nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tapd372b816-e4 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the persistent domain config.
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.376 225709 DEBUG nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] (1/8): Attempting to detach device tapd372b816-e4 with device alias net1 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.376 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <mac address="fa:16:3e:02:7d:ea"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <model type="virtio"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <mtu size="1442"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <target dev="tapd372b816-e4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </interface>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 05:23:02 np0005593294 kernel: tapd372b816-e4 (unregistering): left promiscuous mode
Jan 23 05:23:02 np0005593294 NetworkManager[48978]: <info>  [1769163782.4877] device (tapd372b816-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.500 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:02 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:02Z|00068|binding|INFO|Releasing lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 from this chassis (sb_readonly=0)
Jan 23 05:23:02 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:02Z|00069|binding|INFO|Setting lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 down in Southbound
Jan 23 05:23:02 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:02Z|00070|binding|INFO|Removing iface tapd372b816-e4 ovn-installed in OVS
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.503 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.505 225709 DEBUG nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Received event <DeviceRemovedEvent: 1769163782.504969, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.507 225709 DEBUG nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Start waiting for the detach event from libvirt for device tapd372b816-e4 with device alias net1 for instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.508 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.510 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:7d:ea 10.100.0.20', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=d372b816-e400-40ea-9e6b-cc8c21e54bc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.512 143098 INFO neutron.agent.ovn.metadata.agent [-] Port d372b816-e400-40ea-9e6b-cc8c21e54bc6 in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 unbound from our chassis
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.513 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a09a282-aa22-47cf-a68d-ce0dba493868, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.515 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <name>instance-00000006</name>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:22:12</nova:creationTime>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:port uuid="d372b816-e400-40ea-9e6b-cc8c21e54bc6">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target dev='tapdfaa68a5-31'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.515 225709 INFO nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tapd372b816-e4 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the live domain config.#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.516 225709 DEBUG nova.virt.libvirt.vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.516 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.517 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.517 225709 DEBUG os_vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.516 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfe9cd4-f62f-4027-9bf9-7e5ba7bb2e2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.519 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.520 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd372b816-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.520 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace which is not needed anymore#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.521 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.521 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.526 225709 INFO os_vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4')#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.527 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:23:02</nova:creationTime>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:23:02 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:02 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:23:02 np0005593294 nova_compute[225705]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:23:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:02 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : haproxy version is 2.8.14-c23fe91
Jan 23 05:23:02 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : path to executable is /usr/sbin/haproxy
Jan 23 05:23:02 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [WARNING]  (234286) : Exiting Master process...
Jan 23 05:23:02 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [WARNING]  (234286) : Exiting Master process...
Jan 23 05:23:02 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [ALERT]    (234286) : Current worker (234288) exited with code 143 (Terminated)
Jan 23 05:23:02 np0005593294 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [WARNING]  (234286) : All workers exited. Exiting... (0)
Jan 23 05:23:02 np0005593294 systemd[1]: libpod-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315.scope: Deactivated successfully.
Jan 23 05:23:02 np0005593294 podman[234571]: 2026-01-23 10:23:02.68008732 +0000 UTC m=+0.048972678 container died 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:23:02 np0005593294 systemd[1]: var-lib-containers-storage-overlay-44d1fb66475eda2601ea35994a86f5afc04dbdd9f74d7699ab0867a140ff5c5c-merged.mount: Deactivated successfully.
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.711 225709 DEBUG nova.compute.manager [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-unplugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:02 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315-userdata-shm.mount: Deactivated successfully.
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.712 225709 DEBUG oslo_concurrency.lockutils [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 DEBUG oslo_concurrency.lockutils [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 DEBUG oslo_concurrency.lockutils [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 DEBUG nova.compute.manager [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-unplugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 WARNING nova.compute.manager [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-unplugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:23:02 np0005593294 podman[234571]: 2026-01-23 10:23:02.718942088 +0000 UTC m=+0.087827476 container cleanup 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:23:02 np0005593294 systemd[1]: libpod-conmon-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315.scope: Deactivated successfully.
Jan 23 05:23:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:02 np0005593294 podman[234601]: 2026-01-23 10:23:02.793480265 +0000 UTC m=+0.047473162 container remove 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.802 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0ffad06e-1231-4312-871d-dd910e56b1cc]: (4, ('Fri Jan 23 10:23:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315)\n30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315\nFri Jan 23 10:23:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315)\n30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.804 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[a5020298-920b-4b2e-99c2-eaeb005d1993]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.806 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:02 np0005593294 kernel: tap6a09a282-a0: left promiscuous mode
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.808 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 nova_compute[225705]: 2026-01-23 10:23:02.827 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:02.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.831 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[77143509-cd5a-4625-b737-ff322b64cbd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.848 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e37facdb-ab54-46f6-a1e6-26e13b7fe821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.849 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2d026b22-a096-46ae-a4e7-e4c0aa627e89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.873 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d89314c4-bc78-413b-8619-0d11817a08d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488674, 'reachable_time': 33027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234616, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.875 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:23:02 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.876 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ac2f4b-eb5b-4eee-b935-e319b98de284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:02 np0005593294 systemd[1]: run-netns-ovnmeta\x2d6a09a282\x2daa22\x2d47cf\x2da68d\x2dce0dba493868.mount: Deactivated successfully.
Jan 23 05:23:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:23:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:23:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:03 np0005593294 nova_compute[225705]: 2026-01-23 10:23:03.872 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:03 np0005593294 nova_compute[225705]: 2026-01-23 10:23:03.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:03 np0005593294 nova_compute[225705]: 2026-01-23 10:23:03.873 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:23:03 np0005593294 nova_compute[225705]: 2026-01-23 10:23:03.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:23:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:04.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.564 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.564 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.565 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.565 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:04.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.853 225709 DEBUG nova.compute.manager [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG oslo_concurrency.lockutils [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG oslo_concurrency.lockutils [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG oslo_concurrency.lockutils [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG nova.compute.manager [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:04 np0005593294 nova_compute[225705]: 2026-01-23 10:23:04.855 225709 WARNING nova.compute.manager [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:23:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:05 np0005593294 nova_compute[225705]: 2026-01-23 10:23:05.686 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:23:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:06 np0005593294 nova_compute[225705]: 2026-01-23 10:23:06.612 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:06.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:07 np0005593294 nova_compute[225705]: 2026-01-23 10:23:07.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:07 np0005593294 nova_compute[225705]: 2026-01-23 10:23:07.541 225709 DEBUG nova.compute.manager [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-deleted-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:07 np0005593294 nova_compute[225705]: 2026-01-23 10:23:07.541 225709 INFO nova.compute.manager [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Neutron deleted interface d372b816-e400-40ea-9e6b-cc8c21e54bc6; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:23:07 np0005593294 nova_compute[225705]: 2026-01-23 10:23:07.541 225709 DEBUG nova.network.neutron [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:07 np0005593294 nova_compute[225705]: 2026-01-23 10:23:07.930 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.117 225709 DEBUG nova.objects.instance [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'system_metadata' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:08.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:08 np0005593294 podman[234645]: 2026-01-23 10:23:08.719456618 +0000 UTC m=+0.113703100 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:23:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.828 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.829 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.829 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.829 225709 DEBUG nova.network.neutron [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.830 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.831 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.832 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.832 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.832 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:08.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:08 np0005593294 nova_compute[225705]: 2026-01-23 10:23:08.869 225709 DEBUG nova.objects.instance [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'flavor' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:09 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:09Z|00071|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.087 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.245 225709 DEBUG nova.virt.libvirt.vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.246 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.247 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.251 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.253 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.253 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.254 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.254 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.254 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.282 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <name>instance-00000006</name>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:23:02</nova:creationTime>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target dev='tapdfaa68a5-31'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.284 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.296 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <name>instance-00000006</name>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:23:02</nova:creationTime>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <memory unit='KiB'>131072</memory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <resource>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <partition>/machine</partition>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </resource>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <sysinfo type='smbios'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <boot dev='hd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <smbios mode='sysinfo'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <vmcoreinfo state='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <vendor>AMD</vendor>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='x2apic'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc-deadline'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='tsc_adjust'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='spec-ctrl'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='stibp'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ssbd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='cmp_legacy'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='overflow-recov'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='succor'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='ibrs'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='amd-ssbd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='virt-ssbd'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='lbrv'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='tsc-scale'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='vmcb-clean'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='flushbyasid'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pause-filter'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='pfthreshold'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svme-addr-chk'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='xsaves'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='svm'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='require' name='topoext'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='npt'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <feature policy='disable' name='nrip-save'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <clock offset='utc'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <timer name='hpet' present='no'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <on_reboot>restart</on_reboot>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <on_crash>destroy</on_crash>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <disk type='network' device='disk'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target dev='vda' bus='virtio'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='virtio-disk0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <disk type='network' device='cdrom'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <auth username='openstack'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target dev='sda' bus='sata'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <readonly/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='sata0-0-0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pcie.0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='1' port='0x10'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='2' port='0x11'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='3' port='0x12'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='4' port='0x13'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='5' port='0x14'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='6' port='0x15'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='7' port='0x16'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='8' port='0x17'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.8'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='9' port='0x18'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.9'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='10' port='0x19'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.10'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='11' port='0x1a'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.11'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='12' port='0x1b'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.12'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='13' port='0x1c'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.13'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='14' port='0x1d'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.14'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='15' port='0x1e'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.15'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='16' port='0x1f'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.16'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='17' port='0x20'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.17'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='18' port='0x21'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.18'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='19' port='0x22'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.19'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='20' port='0x23'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.20'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='21' port='0x24'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.21'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='22' port='0x25'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.22'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='23' port='0x26'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.23'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='24' port='0x27'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.24'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-root-port'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target chassis='25' port='0x28'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.25'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model name='pcie-pci-bridge'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='pci.26'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='usb'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <controller type='sata' index='0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='ide'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </controller>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <interface type='ethernet'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target dev='tapdfaa68a5-31'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model type='virtio'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <mtu size='1442'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='net0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <serial type='pty'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target type='isa-serial' port='0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:        <model name='isa-serial'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      </target>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <source path='/dev/pts/0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <target type='serial' port='0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='serial0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </console>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <input type='tablet' bus='usb'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='input0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <input type='mouse' bus='ps2'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='input1'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <input type='keyboard' bus='ps2'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='input2'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </input>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <listen type='address' address='::0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </graphics>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <audio id='1' type='none'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='video0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <watchdog model='itco' action='reset'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='watchdog0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </watchdog>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <memballoon model='virtio'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <stats period='10'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='balloon0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <rng model='virtio'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <alias name='rng0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <label>+107:+107</label>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </seclabel>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.296 225709 WARNING nova.virt.libvirt.driver [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Detaching interface fa:16:3e:02:7d:ea failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapd372b816-e4' not found.#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.297 225709 DEBUG nova.virt.libvirt.vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.298 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.299 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.299 225709 DEBUG os_vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.302 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.303 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd372b816-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.303 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.306 225709 INFO os_vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4')#033[00m
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.307 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:creationTime>2026-01-23 10:23:09</nova:creationTime>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:flavor name="m1.nano">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:memory>128</nova:memory>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:disk>1</nova:disk>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:swap>0</nova:swap>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:flavor>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:owner>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:owner>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  <nova:ports>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 05:23:09 np0005593294 nova_compute[225705]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:    </nova:port>
Jan 23 05:23:09 np0005593294 nova_compute[225705]:  </nova:ports>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: </nova:instance>
Jan 23 05:23:09 np0005593294 nova_compute[225705]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:23:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:23:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1488791737' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:09 np0005593294 nova_compute[225705]: 2026-01-23 10:23:09.760 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:10.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.662 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.662 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:10.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.928 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.929 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4732MB free_disk=59.942562103271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.930 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.930 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.993 225709 DEBUG nova.compute.manager [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.994 225709 DEBUG nova.compute.manager [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:10 np0005593294 nova_compute[225705]: 2026-01-23 10:23:10.994 225709 DEBUG oslo_concurrency.lockutils [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.034 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.035 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.035 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.094 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.095 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.096 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.096 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.097 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.099 225709 INFO nova.compute.manager [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Terminating instance#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.101 225709 DEBUG nova.compute.manager [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.138 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:11 np0005593294 kernel: tapdfaa68a5-31 (unregistering): left promiscuous mode
Jan 23 05:23:11 np0005593294 NetworkManager[48978]: <info>  [1769163791.1508] device (tapdfaa68a5-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:23:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:11Z|00072|binding|INFO|Releasing lport dfaa68a5-31a2-4de5-996e-11936357ca9b from this chassis (sb_readonly=0)
Jan 23 05:23:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:11Z|00073|binding|INFO|Setting lport dfaa68a5-31a2-4de5-996e-11936357ca9b down in Southbound
Jan 23 05:23:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:11Z|00074|binding|INFO|Removing iface tapdfaa68a5-31 ovn-installed in OVS
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.163 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.178 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:90:a0 10.100.0.11'], port_security=['fa:16:3e:b7:90:a0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8afa87f8-5b22-4350-8bf2-c7af019c3372', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dbc1781-4648-4570-b3c6-0353674ab246, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=dfaa68a5-31a2-4de5-996e-11936357ca9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.179 143098 INFO neutron.agent.ovn.metadata.agent [-] Port dfaa68a5-31a2-4de5-996e-11936357ca9b in datapath eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 unbound from our chassis#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.180 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.181 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7cc74c-15e2-43b2-87ed-8dd2db394424]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.182 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 namespace which is not needed anymore#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.194 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 23 05:23:11 np0005593294 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 17.604s CPU time.
Jan 23 05:23:11 np0005593294 systemd-machined[194551]: Machine qemu-3-instance-00000006 terminated.
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.326 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : haproxy version is 2.8.14-c23fe91
Jan 23 05:23:11 np0005593294 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : path to executable is /usr/sbin/haproxy
Jan 23 05:23:11 np0005593294 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [ALERT]    (234054) : Current worker (234056) exited with code 143 (Terminated)
Jan 23 05:23:11 np0005593294 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [WARNING]  (234054) : All workers exited. Exiting... (0)
Jan 23 05:23:11 np0005593294 systemd[1]: libpod-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee.scope: Deactivated successfully.
Jan 23 05:23:11 np0005593294 podman[234723]: 2026-01-23 10:23:11.334991779 +0000 UTC m=+0.058947496 container died 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.335 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.343 225709 INFO nova.virt.libvirt.driver [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance destroyed successfully.#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.344 225709 DEBUG nova.objects.instance [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:11 np0005593294 systemd[1]: var-lib-containers-storage-overlay-33d908cc39ee3158e23a819479dadd5bc8e191abccd206682925f5884fa34301-merged.mount: Deactivated successfully.
Jan 23 05:23:11 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee-userdata-shm.mount: Deactivated successfully.
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.379 225709 DEBUG nova.virt.libvirt.vif [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.379 225709 DEBUG nova.network.os_vif_util [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.380 225709 DEBUG nova.network.os_vif_util [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.380 225709 DEBUG os_vif [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.381 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.382 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfaa68a5-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:23:11 np0005593294 podman[234723]: 2026-01-23 10:23:11.389797444 +0000 UTC m=+0.113753131 container cleanup 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.389 225709 INFO os_vif [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31')#033[00m
Jan 23 05:23:11 np0005593294 systemd[1]: libpod-conmon-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee.scope: Deactivated successfully.
Jan 23 05:23:11 np0005593294 podman[234791]: 2026-01-23 10:23:11.465217792 +0000 UTC m=+0.051284397 container remove 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.474 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c04b5cc9-7b4d-4e29-95aa-cc6d2488671a]: (4, ('Fri Jan 23 10:23:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 (77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee)\n77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee\nFri Jan 23 10:23:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 (77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee)\n77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.476 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb4ec96-72c9-4ba9-aa4a-1197302c2002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.477 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeae9e618-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.479 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 kernel: tapeae9e618-a0: left promiscuous mode
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.499 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6bc248-e06a-429f-90d4-5cb6fe024cf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.517 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7b603179-c186-42b2-994e-4dfab042785f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.521 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[66627b4e-35f5-46d6-8920-28644d9053a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.540 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4ae76a-8b97-4534-958a-dfa9fe36b0e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485745, 'reachable_time': 31343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234819, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.544 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:23:11 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.544 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[45687743-82da-41ba-86e5-940de42c2c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:11 np0005593294 systemd[1]: run-netns-ovnmeta\x2deae9e618\x2da7c2\x2d43e9\x2dab46\x2d9070ca2ef7f2.mount: Deactivated successfully.
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.613 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:23:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/721921800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.644 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.652 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.688 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.689 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:23:11 np0005593294 nova_compute[225705]: 2026-01-23 10:23:11.689 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.069 225709 DEBUG nova.network.neutron [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.085 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.087 225709 DEBUG oslo_concurrency.lockutils [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.087 225709 DEBUG nova.network.neutron [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.111 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-d372b816-e400-40ea-9e6b-cc8c21e54bc6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 9.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.286 225709 INFO nova.virt.libvirt.driver [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deleting instance files /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2_del#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.287 225709 INFO nova.virt.libvirt.driver [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deletion of /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2_del complete#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.331 225709 INFO nova.compute.manager [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 1.23 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.332 225709 DEBUG oslo.service.loopingcall [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.332 225709 DEBUG nova.compute.manager [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.332 225709 DEBUG nova.network.neutron [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:23:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:12.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102312 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:23:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.731 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.751 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.752 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593294 nova_compute[225705]: 2026-01-23 10:23:12.752 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:23:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:12.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.148 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-unplugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.149 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.149 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.149 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.150 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-unplugged-dfaa68a5-31a2-4de5-996e-11936357ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.150 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-unplugged-dfaa68a5-31a2-4de5-996e-11936357ca9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.150 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.151 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.151 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.151 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.152 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.152 225709 WARNING nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:23:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.897 225709 DEBUG nova.network.neutron [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.924 225709 INFO nova.compute.manager [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 1.59 seconds to deallocate network for instance.#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.982 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.982 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:13 np0005593294 nova_compute[225705]: 2026-01-23 10:23:13.996 225709 DEBUG nova.compute.manager [req-c026a045-b632-4c6b-8d6c-ab509f3a3f72 req-299df4e3-51e4-47f1-98a4-29f28d2d21ee 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-deleted-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.036 225709 DEBUG oslo_concurrency.processutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.172 225709 DEBUG nova.network.neutron [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.173 225709 DEBUG nova.network.neutron [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.196 225709 DEBUG oslo_concurrency.lockutils [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:14.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:23:14 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1573239966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.548 225709 DEBUG oslo_concurrency.processutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.556 225709 DEBUG nova.compute.provider_tree [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.579 225709 DEBUG nova.scheduler.client.report [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.602 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.628 225709 INFO nova.scheduler.client.report [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance db7b623e-73d8-45e2-b7eb-1a861bef62c2#033[00m
Jan 23 05:23:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:14 np0005593294 nova_compute[225705]: 2026-01-23 10:23:14.703 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:14.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:16.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:16 np0005593294 nova_compute[225705]: 2026-01-23 10:23:16.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:16 np0005593294 nova_compute[225705]: 2026-01-23 10:23:16.614 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:18.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:19 np0005593294 nova_compute[225705]: 2026-01-23 10:23:19.700 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:19 np0005593294 nova_compute[225705]: 2026-01-23 10:23:19.804 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:20.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:21 np0005593294 nova_compute[225705]: 2026-01-23 10:23:21.389 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:21 np0005593294 nova_compute[225705]: 2026-01-23 10:23:21.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:22.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:22 np0005593294 podman[234851]: 2026-01-23 10:23:22.640632205 +0000 UTC m=+0.048755578 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:23:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:22.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:24.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:24 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:24.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:25 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:26 np0005593294 nova_compute[225705]: 2026-01-23 10:23:26.341 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163791.3396697, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:26 np0005593294 nova_compute[225705]: 2026-01-23 10:23:26.341 225709 INFO nova.compute.manager [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:23:26 np0005593294 nova_compute[225705]: 2026-01-23 10:23:26.369 225709 DEBUG nova.compute.manager [None req-21a88963-318e-48c9-95f1-56c738fa502d - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:26.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:26 np0005593294 nova_compute[225705]: 2026-01-23 10:23:26.394 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:26 np0005593294 nova_compute[225705]: 2026-01-23 10:23:26.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:26 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:26.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:27 np0005593294 kernel: ganesha.nfsd[234383]: segfault at 50 ip 00007faf9a38932e sp 00007faf037fd210 error 4 in libntirpc.so.5.8[7faf9a36e000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 05:23:27 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:23:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy ignored for local
Jan 23 05:23:27 np0005593294 systemd[1]: Started Process Core Dump (PID 234898/UID 0).
Jan 23 05:23:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:28.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:28 np0005593294 systemd-coredump[234899]: Process 228650 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 79:#012#0  0x00007faf9a38932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:23:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:28 np0005593294 systemd[1]: systemd-coredump@10-234898-0.service: Deactivated successfully.
Jan 23 05:23:28 np0005593294 systemd[1]: systemd-coredump@10-234898-0.service: Consumed 1.217s CPU time.
Jan 23 05:23:28 np0005593294 podman[234905]: 2026-01-23 10:23:28.756768728 +0000 UTC m=+0.031872833 container died 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:23:28 np0005593294 systemd[1]: var-lib-containers-storage-overlay-bed27c64872bf8529345969a4191e92df6ccf355f83eb481cb6d4f93143e1094-merged.mount: Deactivated successfully.
Jan 23 05:23:28 np0005593294 podman[234905]: 2026-01-23 10:23:28.80085709 +0000 UTC m=+0.075961125 container remove 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:23:28 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:23:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:28.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:29 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 05:23:29 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.409s CPU time.
Jan 23 05:23:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:30.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:30.517 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:30 np0005593294 nova_compute[225705]: 2026-01-23 10:23:30.518 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:30.518 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:23:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:30.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:31 np0005593294 nova_compute[225705]: 2026-01-23 10:23:31.397 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:31 np0005593294 nova_compute[225705]: 2026-01-23 10:23:31.623 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:32.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:32.521 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:32.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:33 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102333 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:23:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:34.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:34.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:36 np0005593294 nova_compute[225705]: 2026-01-23 10:23:36.400 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:36.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:36 np0005593294 nova_compute[225705]: 2026-01-23 10:23:36.625 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:36.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:38.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:38.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:39 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 11.
Jan 23 05:23:39 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:23:39 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.409s CPU time.
Jan 23 05:23:39 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:23:39 np0005593294 podman[234954]: 2026-01-23 10:23:39.228403093 +0000 UTC m=+0.125624192 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 23 05:23:39 np0005593294 podman[235025]: 2026-01-23 10:23:39.383767569 +0000 UTC m=+0.072312231 container create 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 05:23:39 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:39 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:39 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:39 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:39 np0005593294 podman[235025]: 2026-01-23 10:23:39.356217842 +0000 UTC m=+0.044762554 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:23:39 np0005593294 podman[235025]: 2026-01-23 10:23:39.454181492 +0000 UTC m=+0.142726224 container init 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:23:39 np0005593294 podman[235025]: 2026-01-23 10:23:39.459878289 +0000 UTC m=+0.148422931 container start 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 23 05:23:39 np0005593294 bash[235025]: 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a
Jan 23 05:23:39 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:23:39 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:23:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:40.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:40.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:41 np0005593294 nova_compute[225705]: 2026-01-23 10:23:41.405 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:41 np0005593294 nova_compute[225705]: 2026-01-23 10:23:41.626 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.869178) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821869209, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1625, "num_deletes": 505, "total_data_size": 3294967, "memory_usage": 3352472, "flush_reason": "Manual Compaction"}
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821880260, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1383209, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28237, "largest_seqno": 29857, "table_properties": {"data_size": 1377972, "index_size": 2057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16001, "raw_average_key_size": 19, "raw_value_size": 1364717, "raw_average_value_size": 1630, "num_data_blocks": 90, "num_entries": 837, "num_filter_entries": 837, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163706, "oldest_key_time": 1769163706, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 11121 microseconds, and 4914 cpu microseconds.
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.880298) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1383209 bytes OK
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.880316) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882226) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882239) EVENT_LOG_v1 {"time_micros": 1769163821882235, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882257) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3286579, prev total WAL file size 3286579, number of live WAL files 2.
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.883095) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353130' seq:72057594037927935, type:22 .. '6C6F676D00373631' seq:0, type:0; will stop at (end)
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1350KB)], [54(14MB)]
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821883171, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16235616, "oldest_snapshot_seqno": -1}
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5784 keys, 12761688 bytes, temperature: kUnknown
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821994316, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 12761688, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12724317, "index_size": 21837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 149288, "raw_average_key_size": 25, "raw_value_size": 12620745, "raw_average_value_size": 2182, "num_data_blocks": 877, "num_entries": 5784, "num_filter_entries": 5784, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.994751) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 12761688 bytes
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.997001) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.9 rd, 114.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.2 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(21.0) write-amplify(9.2) OK, records in: 6762, records dropped: 978 output_compression: NoCompression
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.997034) EVENT_LOG_v1 {"time_micros": 1769163821997019, "job": 32, "event": "compaction_finished", "compaction_time_micros": 111249, "compaction_time_cpu_micros": 53481, "output_level": 6, "num_output_files": 1, "total_output_size": 12761688, "num_input_records": 6762, "num_output_records": 5784, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:41 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821997701, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163822002614, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:42.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:42.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:43 np0005593294 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 05:23:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:44.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:45 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:23:45 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:45 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:23:46 np0005593294 nova_compute[225705]: 2026-01-23 10:23:46.408 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:46.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:46 np0005593294 nova_compute[225705]: 2026-01-23 10:23:46.629 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:46.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:48.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:48.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:50.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:51 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:23:51 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:51 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:51 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:23:51 np0005593294 nova_compute[225705]: 2026-01-23 10:23:51.413 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:51 np0005593294 nova_compute[225705]: 2026-01-23 10:23:51.631 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:23:51 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:23:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:52.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:52 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9d4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:52 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:52.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:53 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:53 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:53 np0005593294 podman[235215]: 2026-01-23 10:23:53.683998446 +0000 UTC m=+0.082676634 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:23:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:54.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:54 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:54 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:54 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:54.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:55.053 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:55.054 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:23:55.054 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102355 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:23:55 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:55 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:55 np0005593294 ovn_controller[133293]: 2026-01-23T10:23:55Z|00075|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 23 05:23:55 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:55 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:56 np0005593294 nova_compute[225705]: 2026-01-23 10:23:56.418 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:56.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:56 np0005593294 nova_compute[225705]: 2026-01-23 10:23:56.677 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:56 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:56 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:56 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:56.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:57 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:58.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:58 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:58 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:58 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:23:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:58.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:59 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:59 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 05:24:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:00.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 05:24:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:00 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:00 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:00 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:00.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:01 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:01 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:01 np0005593294 nova_compute[225705]: 2026-01-23 10:24:01.422 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:01 np0005593294 nova_compute[225705]: 2026-01-23 10:24:01.679 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:02.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:02 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:02 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:02.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:03 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:03 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:03 np0005593294 nova_compute[225705]: 2026-01-23 10:24:03.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:03 np0005593294 nova_compute[225705]: 2026-01-23 10:24:03.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:24:03 np0005593294 nova_compute[225705]: 2026-01-23 10:24:03.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:24:03 np0005593294 nova_compute[225705]: 2026-01-23 10:24:03.894 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:24:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:04.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:04 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:04 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:04 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:04.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:05 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:05 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:05 np0005593294 nova_compute[225705]: 2026-01-23 10:24:05.887 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:06 np0005593294 nova_compute[225705]: 2026-01-23 10:24:06.426 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:06.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:06 np0005593294 nova_compute[225705]: 2026-01-23 10:24:06.680 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:06 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:06 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:06 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:06 np0005593294 nova_compute[225705]: 2026-01-23 10:24:06.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:06.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:07 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:07 np0005593294 nova_compute[225705]: 2026-01-23 10:24:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:07 np0005593294 nova_compute[225705]: 2026-01-23 10:24:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.108 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:08.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:08 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1478679152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.615 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:08 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:08 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:08 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.826 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.827 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4908MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.828 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.828 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.930 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.931 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:24:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:08.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.949 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.985 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.985 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:24:08 np0005593294 nova_compute[225705]: 2026-01-23 10:24:08.999 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.029 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.050 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:09 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:09 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3295464724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.540 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.548 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.563 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.602 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:24:09 np0005593294 nova_compute[225705]: 2026-01-23 10:24:09.602 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:09 np0005593294 podman[235337]: 2026-01-23 10:24:09.698075723 +0000 UTC m=+0.100723456 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:24:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:10.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:10 np0005593294 nova_compute[225705]: 2026-01-23 10:24:10.602 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:10 np0005593294 nova_compute[225705]: 2026-01-23 10:24:10.603 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:10 np0005593294 nova_compute[225705]: 2026-01-23 10:24:10.605 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:10 np0005593294 nova_compute[225705]: 2026-01-23 10:24:10.605 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:24:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:10 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:10 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:10 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:10.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:11 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:11 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:11 np0005593294 nova_compute[225705]: 2026-01-23 10:24:11.429 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:11 np0005593294 nova_compute[225705]: 2026-01-23 10:24:11.682 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:12.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:12 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:12 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:12 np0005593294 nova_compute[225705]: 2026-01-23 10:24:12.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:12.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:13 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:13 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:14.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:14 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:14 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:14 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:14.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:15 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:15 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:16 np0005593294 nova_compute[225705]: 2026-01-23 10:24:16.433 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:16.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:16 np0005593294 nova_compute[225705]: 2026-01-23 10:24:16.684 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102416 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:24:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:16 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:16 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:16 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:16.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:17 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:18.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:18 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:18 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:18 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:19 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:19 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:20.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:20 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:20 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:20 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:20.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:21 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:21 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9d4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:21 np0005593294 nova_compute[225705]: 2026-01-23 10:24:21.437 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:21 np0005593294 nova_compute[225705]: 2026-01-23 10:24:21.686 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:22.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:22 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:22 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:22.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:23 np0005593294 kernel: ganesha.nfsd[235200]: segfault at 50 ip 00007fba5ddef32e sp 00007fb9f1ffa210 error 4 in libntirpc.so.5.8[7fba5ddd4000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 05:24:23 np0005593294 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:24:23 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:23 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc003cc0 fd 38 proxy ignored for local
Jan 23 05:24:23 np0005593294 systemd[1]: Started Process Core Dump (PID 235370/UID 0).
Jan 23 05:24:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:24.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:24 np0005593294 systemd-coredump[235372]: Process 235046 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007fba5ddef32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:24:24 np0005593294 podman[235373]: 2026-01-23 10:24:24.652199517 +0000 UTC m=+0.052465384 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 05:24:24 np0005593294 systemd[1]: systemd-coredump@11-235370-0.service: Deactivated successfully.
Jan 23 05:24:24 np0005593294 systemd[1]: systemd-coredump@11-235370-0.service: Consumed 1.131s CPU time.
Jan 23 05:24:24 np0005593294 podman[235396]: 2026-01-23 10:24:24.74416399 +0000 UTC m=+0.020296313 container died 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 05:24:24 np0005593294 systemd[1]: var-lib-containers-storage-overlay-c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373-merged.mount: Deactivated successfully.
Jan 23 05:24:24 np0005593294 podman[235396]: 2026-01-23 10:24:24.792941119 +0000 UTC m=+0.069073442 container remove 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:24:24 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:24:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:24.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:24 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 05:24:24 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.442s CPU time.
Jan 23 05:24:26 np0005593294 nova_compute[225705]: 2026-01-23 10:24:26.444 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:26.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:26 np0005593294 nova_compute[225705]: 2026-01-23 10:24:26.687 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:26.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:28.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:28.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:29 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102429 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:24:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:30.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:24:30.923 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:30 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:24:30.923 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:24:30 np0005593294 nova_compute[225705]: 2026-01-23 10:24:30.924 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:30.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:31 np0005593294 nova_compute[225705]: 2026-01-23 10:24:31.449 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:31 np0005593294 nova_compute[225705]: 2026-01-23 10:24:31.689 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:32.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:32.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:34.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:34.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:35 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 12.
Jan 23 05:24:35 np0005593294 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:24:35 np0005593294 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.442s CPU time.
Jan 23 05:24:35 np0005593294 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:24:35 np0005593294 podman[235518]: 2026-01-23 10:24:35.424372347 +0000 UTC m=+0.050381889 container create 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 05:24:35 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:35 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:35 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:35 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:35 np0005593294 podman[235518]: 2026-01-23 10:24:35.494949525 +0000 UTC m=+0.120959067 container init 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:24:35 np0005593294 podman[235518]: 2026-01-23 10:24:35.404948873 +0000 UTC m=+0.030958395 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:24:35 np0005593294 podman[235518]: 2026-01-23 10:24:35.504065719 +0000 UTC m=+0.130075221 container start 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 23 05:24:35 np0005593294 bash[235518]: 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:24:35 np0005593294 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:24:35 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:24:36 np0005593294 nova_compute[225705]: 2026-01-23 10:24:36.453 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:36.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:36 np0005593294 nova_compute[225705]: 2026-01-23 10:24:36.691 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:36.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:38.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:38 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:24:38.926 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:38.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:40.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:40 np0005593294 podman[235579]: 2026-01-23 10:24:40.691985664 +0000 UTC m=+0.093576404 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:24:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:41.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:41 np0005593294 nova_compute[225705]: 2026-01-23 10:24:41.456 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:41 np0005593294 nova_compute[225705]: 2026-01-23 10:24:41.693 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:24:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:24:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:24:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:42.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:43.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:44.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:45.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:46 np0005593294 nova_compute[225705]: 2026-01-23 10:24:46.460 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:46.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:46 np0005593294 nova_compute[225705]: 2026-01-23 10:24:46.695 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:24:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:24:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:24:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:24:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:47.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:48.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:49.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:50.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:51.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:51 np0005593294 nova_compute[225705]: 2026-01-23 10:24:51.492 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:51 np0005593294 nova_compute[225705]: 2026-01-23 10:24:51.697 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:24:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:24:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:24:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:24:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:52.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:53.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:54.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:55.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:24:55.055 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:24:55.055 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:24:55.056 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:55 np0005593294 podman[235658]: 2026-01-23 10:24:55.704958258 +0000 UTC m=+0.087955020 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.733 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.733 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.817 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.903 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.903 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.911 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:24:55 np0005593294 nova_compute[225705]: 2026-01-23 10:24:55.911 225709 INFO nova.compute.claims [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.014 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:56 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:56 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1853192220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.499 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.507 225709 DEBUG nova.compute.provider_tree [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.524 225709 DEBUG nova.scheduler.client.report [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:56.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.547 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.548 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.664 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.664 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.698 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.814 225709 DEBUG nova.policy [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.838 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:24:56 np0005593294 nova_compute[225705]: 2026-01-23 10:24:56.936 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:24:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:24:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:24:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:24:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:24:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:57.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.094 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.097 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.097 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Creating image(s)#033[00m
Jan 23 05:24:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:24:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.143 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.175 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.213 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.217 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.312 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.314 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.315 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.315 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.354 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.358 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.678 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.784 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.921 225709 DEBUG nova.objects.instance [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid ee5670f1-f0fa-4c86-855a-ce14c49091ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.937 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.937 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Ensure instance console log exists: /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.938 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.939 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.939 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:57 np0005593294 nova_compute[225705]: 2026-01-23 10:24:57.945 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Successfully created port: b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:24:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:58.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:58 np0005593294 nova_compute[225705]: 2026-01-23 10:24:58.916 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Successfully updated port: b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:24:58 np0005593294 nova_compute[225705]: 2026-01-23 10:24:58.941 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:58 np0005593294 nova_compute[225705]: 2026-01-23 10:24:58.941 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:58 np0005593294 nova_compute[225705]: 2026-01-23 10:24:58.941 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:24:59 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:59 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:24:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:59.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:59 np0005593294 nova_compute[225705]: 2026-01-23 10:24:59.063 225709 DEBUG nova.compute.manager [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:59 np0005593294 nova_compute[225705]: 2026-01-23 10:24:59.064 225709 DEBUG nova.compute.manager [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing instance network info cache due to event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:59 np0005593294 nova_compute[225705]: 2026-01-23 10:24:59.064 225709 DEBUG oslo_concurrency.lockutils [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:59 np0005593294 nova_compute[225705]: 2026-01-23 10:24:59.213 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.041 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:00 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:00 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.391 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.391 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance network_info: |[{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.392 225709 DEBUG oslo_concurrency.lockutils [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.393 225709 DEBUG nova.network.neutron [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.398 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start _get_guest_xml network_info=[{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.405 225709 WARNING nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.410 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.411 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.416 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.416 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.417 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.418 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.418 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.419 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.419 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.420 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.420 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.421 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.421 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.422 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.422 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.423 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.428 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:00.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:00 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:25:00 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/207192348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.947 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:00 np0005593294 nova_compute[225705]: 2026-01-23 10:25:00.997 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.004 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.496 225709 DEBUG nova.network.neutron [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updated VIF entry in instance network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.497 225709 DEBUG nova.network.neutron [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:25:01 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2995660539' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.501 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.515 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.517 225709 DEBUG nova.virt.libvirt.vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-201401353',display_name='tempest-TestNetworkBasicOps-server-201401353',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-201401353',id=10,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSiMVclS29CQGuWDzR4wqcRxghXwRX3OnxYcchVIhN1re6S5JbcUZIdPe1ViONpYjthNnTE0ukmKTamuv4VEW3D7ha0cmAwvhq7SF9xxubWvSPNpPeahMeeSWlQXgBMKw==',key_name='tempest-TestNetworkBasicOps-247222292',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-bwgfnixj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:57Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ee5670f1-f0fa-4c86-855a-ce14c49091ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.517 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.518 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.519 225709 DEBUG nova.objects.instance [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee5670f1-f0fa-4c86-855a-ce14c49091ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.701 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.871 225709 DEBUG oslo_concurrency.lockutils [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.874 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <uuid>ee5670f1-f0fa-4c86-855a-ce14c49091ec</uuid>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <name>instance-0000000a</name>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <memory>131072</memory>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <vcpu>1</vcpu>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:name>tempest-TestNetworkBasicOps-server-201401353</nova:name>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:creationTime>2026-01-23 10:25:00</nova:creationTime>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:flavor name="m1.nano">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:memory>128</nova:memory>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:disk>1</nova:disk>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:swap>0</nova:swap>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </nova:flavor>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:owner>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </nova:owner>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <nova:ports>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <nova:port uuid="b35832ce-bd22-4306-81e2-4d6c9cc4fb5e">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        </nova:port>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </nova:ports>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </nova:instance>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <sysinfo type="smbios">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <entry name="serial">ee5670f1-f0fa-4c86-855a-ce14c49091ec</entry>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <entry name="uuid">ee5670f1-f0fa-4c86-855a-ce14c49091ec</entry>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <boot dev="hd"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <smbios mode="sysinfo"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <vmcoreinfo/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <clock offset="utc">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <timer name="hpet" present="no"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <cpu mode="host-model" match="exact">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <disk type="network" device="disk">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <target dev="vda" bus="virtio"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <disk type="network" device="cdrom">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <target dev="sda" bus="sata"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <interface type="ethernet">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <mac address="fa:16:3e:52:5a:1e"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <mtu size="1442"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <target dev="tapb35832ce-bd"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <serial type="pty">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <log file="/var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/console.log" append="off"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <input type="tablet" bus="usb"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <rng model="virtio">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <controller type="usb" index="0"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    <memballoon model="virtio">
Jan 23 05:25:01 np0005593294 nova_compute[225705]:      <stats period="10"/>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:25:01 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:25:01 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:25:01 np0005593294 nova_compute[225705]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.876 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Preparing to wait for external event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.876 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.876 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.877 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.878 225709 DEBUG nova.virt.libvirt.vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-201401353',display_name='tempest-TestNetworkBasicOps-server-201401353',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-201401353',id=10,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSiMVclS29CQGuWDzR4wqcRxghXwRX3OnxYcchVIhN1re6S5JbcUZIdPe1ViONpYjthNnTE0ukmKTamuv4VEW3D7ha0cmAwvhq7SF9xxubWvSPNpPeahMeeSWlQXgBMKw==',key_name='tempest-TestNetworkBasicOps-247222292',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-bwgfnixj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:57Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ee5670f1-f0fa-4c86-855a-ce14c49091ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.878 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.879 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.879 225709 DEBUG os_vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.880 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.880 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.881 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.885 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.885 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb35832ce-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.885 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb35832ce-bd, col_values=(('external_ids', {'iface-id': 'b35832ce-bd22-4306-81e2-4d6c9cc4fb5e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:5a:1e', 'vm-uuid': 'ee5670f1-f0fa-4c86-855a-ce14c49091ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:01 np0005593294 NetworkManager[48978]: <info>  [1769163901.9141] manager: (tapb35832ce-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.913 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.917 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.925 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:01 np0005593294 nova_compute[225705]: 2026-01-23 10:25:01.926 225709 INFO os_vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd')#033[00m
Jan 23 05:25:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:02 np0005593294 ceph-mon[80126]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.384 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.385 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.385 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:52:5a:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.386 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Using config drive#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.425 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:02.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.790 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Creating config drive at /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.799 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx0qe8u6x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.933 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx0qe8u6x" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.975 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:02 np0005593294 nova_compute[225705]: 2026-01-23 10:25:02.979 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:03.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.534 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.536 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deleting local config drive /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config because it was imported into RBD.#033[00m
Jan 23 05:25:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:04.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:04 np0005593294 systemd[1]: Starting libvirt secret daemon...
Jan 23 05:25:04 np0005593294 systemd[1]: Started libvirt secret daemon.
Jan 23 05:25:04 np0005593294 kernel: tapb35832ce-bd: entered promiscuous mode
Jan 23 05:25:04 np0005593294 NetworkManager[48978]: <info>  [1769163904.6639] manager: (tapb35832ce-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.665 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:04Z|00076|binding|INFO|Claiming lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for this chassis.
Jan 23 05:25:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:04Z|00077|binding|INFO|b35832ce-bd22-4306-81e2-4d6c9cc4fb5e: Claiming fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.669 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.676 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593294 systemd-machined[194551]: New machine qemu-4-instance-0000000a.
Jan 23 05:25:04 np0005593294 systemd-udevd[236158]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:04 np0005593294 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Jan 23 05:25:04 np0005593294 NetworkManager[48978]: <info>  [1769163904.7291] device (tapb35832ce-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:25:04 np0005593294 NetworkManager[48978]: <info>  [1769163904.7307] device (tapb35832ce-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.744 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:04Z|00078|binding|INFO|Setting lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e ovn-installed in OVS
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.751 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.861 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:5a:1e 10.100.0.7'], port_security=['fa:16:3e:52:5a:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ee5670f1-f0fa-4c86-855a-ce14c49091ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7e7149f7-1f80-4cb0-a07a-4ad2ce209150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79ce8e9f-5595-435d-b3c2-9a811b1982a6, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.862 143098 INFO neutron.agent.ovn.metadata.agent [-] Port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e in datapath a36237e8-b709-4a50-8f8b-9cccdf12f329 bound to our chassis#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.864 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a36237e8-b709-4a50-8f8b-9cccdf12f329#033[00m
Jan 23 05:25:04 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:04Z|00079|binding|INFO|Setting lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e up in Southbound
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.879 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1db16338-ab18-4957-ae9e-93716c4f5d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.880 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa36237e8-b1 in ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.882 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa36237e8-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.883 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[90181eed-e79e-406a-b5c8-83373724f982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.884 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[459b8a71-f361-4e2b-a658-7b1b9006dbba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.897 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:25:04 np0005593294 nova_compute[225705]: 2026-01-23 10:25:04.898 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.905 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[758c2a9e-5765-4a8e-9cdf-a4ed31264132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.935 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf7f991-165d-4915-9cb1-b1e535fe235c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.978 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c879266d-c140-44e5-b77d-f538d7bfbfb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 NetworkManager[48978]: <info>  [1769163904.9871] manager: (tapa36237e8-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 23 05:25:04 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.988 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[17ddea73-6656-4000-bcc4-ff3facaa1d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:04 np0005593294 systemd-udevd[236160]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.028 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[86f7d7b4-d051-4079-a7e4-614e37c1151e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.032 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[dc50a87f-144b-468f-a86d-504cdd8cf88b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:05.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:05 np0005593294 NetworkManager[48978]: <info>  [1769163905.0636] device (tapa36237e8-b0): carrier: link connected
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.073 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[33bafcb1-52a9-461b-8e33-623de263b84c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.097 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7778783e-da58-4e97-b397-fd8deb3028e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36237e8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:d6:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505904, 'reachable_time': 41458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236216, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.114 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[40572e1b-2a47-47e6-a8cf-f810a80719f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:d64c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505904, 'tstamp': 505904}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236226, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.135 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f9098b-6dfd-4028-badf-be0b97414586]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36237e8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:d6:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505904, 'reachable_time': 41458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236229, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.170 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0083f40e-46d2-4422-a3c5-32cfbb24a25f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.252 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[08a1fc8e-959e-47d6-958e-56f9f1781e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.254 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36237e8-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.254 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.254 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa36237e8-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:05 np0005593294 nova_compute[225705]: 2026-01-23 10:25:05.256 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593294 NetworkManager[48978]: <info>  [1769163905.2579] manager: (tapa36237e8-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 23 05:25:05 np0005593294 kernel: tapa36237e8-b0: entered promiscuous mode
Jan 23 05:25:05 np0005593294 nova_compute[225705]: 2026-01-23 10:25:05.259 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.260 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa36237e8-b0, col_values=(('external_ids', {'iface-id': 'fa751ede-fa2b-4950-a999-549fdae5ffae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:05 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:05Z|00080|binding|INFO|Releasing lport fa751ede-fa2b-4950-a999-549fdae5ffae from this chassis (sb_readonly=0)
Jan 23 05:25:05 np0005593294 nova_compute[225705]: 2026-01-23 10:25:05.261 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593294 nova_compute[225705]: 2026-01-23 10:25:05.287 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.288 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a36237e8-b709-4a50-8f8b-9cccdf12f329.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a36237e8-b709-4a50-8f8b-9cccdf12f329.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.288 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce9b3ec-cece-40fa-b02a-e07ed9d1e938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.289 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-a36237e8-b709-4a50-8f8b-9cccdf12f329
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/a36237e8-b709-4a50-8f8b-9cccdf12f329.pid.haproxy
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID a36237e8-b709-4a50-8f8b-9cccdf12f329
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:25:05 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.290 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'env', 'PROCESS_TAG=haproxy-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a36237e8-b709-4a50-8f8b-9cccdf12f329.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:25:05 np0005593294 podman[236287]: 2026-01-23 10:25:05.775876319 +0000 UTC m=+0.080810757 container create b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:05 np0005593294 systemd[1]: Started libpod-conmon-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d.scope.
Jan 23 05:25:05 np0005593294 podman[236287]: 2026-01-23 10:25:05.733940723 +0000 UTC m=+0.038875221 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:25:05 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:25:05 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ee88f003832c2d12671caf45936712538fa9185a4ea2311104f1ecd1bc220b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:05 np0005593294 podman[236287]: 2026-01-23 10:25:05.8838685 +0000 UTC m=+0.188802988 container init b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:25:05 np0005593294 podman[236287]: 2026-01-23 10:25:05.893340205 +0000 UTC m=+0.198274633 container start b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:25:05 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : New worker (236308) forked
Jan 23 05:25:05 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : Loading success.
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.415 225709 DEBUG nova.compute.manager [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.415 225709 DEBUG oslo_concurrency.lockutils [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.416 225709 DEBUG oslo_concurrency.lockutils [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.416 225709 DEBUG oslo_concurrency.lockutils [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.416 225709 DEBUG nova.compute.manager [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Processing event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:25:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:06.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.616 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.617 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163906.6154583, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.618 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Started (Lifecycle Event)#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.624 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.628 225709 INFO nova.virt.libvirt.driver [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance spawned successfully.#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.629 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.636 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.646 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.654 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.654 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.655 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.656 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.657 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.658 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.666 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.668 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163906.6157992, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.669 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.705 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.751 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.757 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163906.6229315, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.757 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.781 225709 INFO nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 9.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.781 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.783 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.795 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.892 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:06 np0005593294 nova_compute[225705]: 2026-01-23 10:25:06.913 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.047 225709 INFO nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 11.18 seconds to build instance.#033[00m
Jan 23 05:25:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:07.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.725 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.895 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.895 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.896 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.896 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:25:07 np0005593294 nova_compute[225705]: 2026-01-23 10:25:07.896 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:25:08 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3677633277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.413 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.491 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.492 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.511 225709 DEBUG nova.compute.manager [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.512 225709 DEBUG oslo_concurrency.lockutils [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.513 225709 DEBUG oslo_concurrency.lockutils [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.513 225709 DEBUG oslo_concurrency.lockutils [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.514 225709 DEBUG nova.compute.manager [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] No waiting events found dispatching network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.515 225709 WARNING nova.compute.manager [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received unexpected event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for instance with vm_state active and task_state None.
Jan 23 05:25:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:08.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.741 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.744 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4696MB free_disk=59.967525482177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.745 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.746 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.952 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance ee5670f1-f0fa-4c86-855a-ce14c49091ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.953 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.953 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:25:08 np0005593294 nova_compute[225705]: 2026-01-23 10:25:08.992 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:25:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:09.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:25:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3681678963' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:09 np0005593294 nova_compute[225705]: 2026-01-23 10:25:09.488 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:25:09 np0005593294 nova_compute[225705]: 2026-01-23 10:25:09.494 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:25:09 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:09 np0005593294 nova_compute[225705]: 2026-01-23 10:25:09.676 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:25:09 np0005593294 nova_compute[225705]: 2026-01-23 10:25:09.721 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:25:09 np0005593294 nova_compute[225705]: 2026-01-23 10:25:09.722 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:25:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:10.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:10 np0005593294 nova_compute[225705]: 2026-01-23 10:25:10.723 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:10 np0005593294 nova_compute[225705]: 2026-01-23 10:25:10.725 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:10 np0005593294 nova_compute[225705]: 2026-01-23 10:25:10.725 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:25:10 np0005593294 nova_compute[225705]: 2026-01-23 10:25:10.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:10 np0005593294 nova_compute[225705]: 2026-01-23 10:25:10.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:10 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:10 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:11.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:11 np0005593294 nova_compute[225705]: 2026-01-23 10:25:11.707 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:11 np0005593294 podman[236396]: 2026-01-23 10:25:11.755905763 +0000 UTC m=+0.145010715 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:25:11 np0005593294 nova_compute[225705]: 2026-01-23 10:25:11.916 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:11Z|00081|binding|INFO|Releasing lport fa751ede-fa2b-4950-a999-549fdae5ffae from this chassis (sb_readonly=0)
Jan 23 05:25:11 np0005593294 NetworkManager[48978]: <info>  [1769163911.9557] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 23 05:25:11 np0005593294 NetworkManager[48978]: <info>  [1769163911.9567] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 23 05:25:11 np0005593294 nova_compute[225705]: 2026-01-23 10:25:11.957 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:11 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:11Z|00082|binding|INFO|Releasing lport fa751ede-fa2b-4950-a999-549fdae5ffae from this chassis (sb_readonly=0)
Jan 23 05:25:11 np0005593294 nova_compute[225705]: 2026-01-23 10:25:11.989 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:11 np0005593294 nova_compute[225705]: 2026-01-23 10:25:11.998 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:12 np0005593294 nova_compute[225705]: 2026-01-23 10:25:12.216 225709 DEBUG nova.compute.manager [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:25:12 np0005593294 nova_compute[225705]: 2026-01-23 10:25:12.217 225709 DEBUG nova.compute.manager [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing instance network info cache due to event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:25:12 np0005593294 nova_compute[225705]: 2026-01-23 10:25:12.218 225709 DEBUG oslo_concurrency.lockutils [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:25:12 np0005593294 nova_compute[225705]: 2026-01-23 10:25:12.219 225709 DEBUG oslo_concurrency.lockutils [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:25:12 np0005593294 nova_compute[225705]: 2026-01-23 10:25:12.219 225709 DEBUG nova.network.neutron [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:25:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:12.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:12 np0005593294 nova_compute[225705]: 2026-01-23 10:25:12.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.017589) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913017690, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1267, "num_deletes": 251, "total_data_size": 3164328, "memory_usage": 3226864, "flush_reason": "Manual Compaction"}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913032436, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2012832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29862, "largest_seqno": 31124, "table_properties": {"data_size": 2007176, "index_size": 2987, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12560, "raw_average_key_size": 20, "raw_value_size": 1995757, "raw_average_value_size": 3234, "num_data_blocks": 128, "num_entries": 617, "num_filter_entries": 617, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163822, "oldest_key_time": 1769163822, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14917 microseconds, and 7019 cpu microseconds.
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032520) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2012832 bytes OK
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032548) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.034267) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.034287) EVENT_LOG_v1 {"time_micros": 1769163913034280, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.034312) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3158232, prev total WAL file size 3158232, number of live WAL files 2.
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.035704) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1965KB)], [57(12MB)]
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913035792, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14774520, "oldest_snapshot_seqno": -1}
Jan 23 05:25:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.016000504s ======
Jan 23 05:25:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:13.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.016000504s
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5880 keys, 12616933 bytes, temperature: kUnknown
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913125569, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12616933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12579081, "index_size": 22062, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 152068, "raw_average_key_size": 25, "raw_value_size": 12474171, "raw_average_value_size": 2121, "num_data_blocks": 881, "num_entries": 5880, "num_filter_entries": 5880, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.125848) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12616933 bytes
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.128728) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.4 rd, 140.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.2 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.6) write-amplify(6.3) OK, records in: 6401, records dropped: 521 output_compression: NoCompression
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.128745) EVENT_LOG_v1 {"time_micros": 1769163913128737, "job": 34, "event": "compaction_finished", "compaction_time_micros": 89861, "compaction_time_cpu_micros": 25777, "output_level": 6, "num_output_files": 1, "total_output_size": 12616933, "num_input_records": 6401, "num_output_records": 5880, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913129179, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913131363, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.035554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131452) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:13 np0005593294 nova_compute[225705]: 2026-01-23 10:25:13.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:14 np0005593294 nova_compute[225705]: 2026-01-23 10:25:14.031 225709 DEBUG nova.network.neutron [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updated VIF entry in instance network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:14 np0005593294 nova_compute[225705]: 2026-01-23 10:25:14.032 225709 DEBUG nova.network.neutron [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:14 np0005593294 nova_compute[225705]: 2026-01-23 10:25:14.057 225709 DEBUG oslo_concurrency.lockutils [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:14.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:15.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:16.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:16 np0005593294 nova_compute[225705]: 2026-01-23 10:25:16.710 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:16 np0005593294 nova_compute[225705]: 2026-01-23 10:25:16.961 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:17.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:18.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:19.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:20 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:20Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 05:25:20 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:20Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 05:25:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:20.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:21.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:21 np0005593294 nova_compute[225705]: 2026-01-23 10:25:21.713 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:21 np0005593294 nova_compute[225705]: 2026-01-23 10:25:21.962 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:22.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:23.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:24.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:25.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:25 np0005593294 nova_compute[225705]: 2026-01-23 10:25:25.143 225709 INFO nova.compute.manager [None req-06ea0805-e3dd-4ccb-9753-3f247a501c58 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Get console output#033[00m
Jan 23 05:25:25 np0005593294 nova_compute[225705]: 2026-01-23 10:25:25.151 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:25:25 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:25Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 05:25:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:26 np0005593294 podman[236457]: 2026-01-23 10:25:26.65938703 +0000 UTC m=+0.051900676 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:26 np0005593294 nova_compute[225705]: 2026-01-23 10:25:26.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:26 np0005593294 nova_compute[225705]: 2026-01-23 10:25:26.964 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:27.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:28 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:28Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 05:25:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:28.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:29.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:30.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:31.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:31 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:31Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 05:25:31 np0005593294 nova_compute[225705]: 2026-01-23 10:25:31.720 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:31 np0005593294 nova_compute[225705]: 2026-01-23 10:25:31.967 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.041 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.043 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.044 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.355 225709 DEBUG nova.compute.manager [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.356 225709 DEBUG nova.compute.manager [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing instance network info cache due to event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.357 225709 DEBUG oslo_concurrency.lockutils [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.357 225709 DEBUG oslo_concurrency.lockutils [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.357 225709 DEBUG nova.network.neutron [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.437 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.440 225709 INFO nova.compute.manager [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Terminating instance#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.441 225709 DEBUG nova.compute.manager [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:25:32 np0005593294 kernel: tapb35832ce-bd (unregistering): left promiscuous mode
Jan 23 05:25:32 np0005593294 NetworkManager[48978]: <info>  [1769163932.4978] device (tapb35832ce-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.505 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:32Z|00083|binding|INFO|Releasing lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e from this chassis (sb_readonly=0)
Jan 23 05:25:32 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:32Z|00084|binding|INFO|Setting lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e down in Southbound
Jan 23 05:25:32 np0005593294 ovn_controller[133293]: 2026-01-23T10:25:32Z|00085|binding|INFO|Removing iface tapb35832ce-bd ovn-installed in OVS
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.518 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:5a:1e 10.100.0.7'], port_security=['fa:16:3e:52:5a:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ee5670f1-f0fa-4c86-855a-ce14c49091ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7e7149f7-1f80-4cb0-a07a-4ad2ce209150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79ce8e9f-5595-435d-b3c2-9a811b1982a6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.520 143098 INFO neutron.agent.ovn.metadata.agent [-] Port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e in datapath a36237e8-b709-4a50-8f8b-9cccdf12f329 unbound from our chassis#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.521 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a36237e8-b709-4a50-8f8b-9cccdf12f329, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.523 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[f107e192-6970-43ff-9fdd-53c01755983b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.523 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 namespace which is not needed anymore#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 23 05:25:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:32 np0005593294 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 14.303s CPU time.
Jan 23 05:25:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:32.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:32 np0005593294 systemd-machined[194551]: Machine qemu-4-instance-0000000a terminated.
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.669 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.678 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.685 225709 INFO nova.virt.libvirt.driver [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance destroyed successfully.#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.685 225709 DEBUG nova.objects.instance [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid ee5670f1-f0fa-4c86-855a-ce14c49091ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.709 225709 DEBUG nova.virt.libvirt.vif [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:24:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-201401353',display_name='tempest-TestNetworkBasicOps-server-201401353',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-201401353',id=10,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSiMVclS29CQGuWDzR4wqcRxghXwRX3OnxYcchVIhN1re6S5JbcUZIdPe1ViONpYjthNnTE0ukmKTamuv4VEW3D7ha0cmAwvhq7SF9xxubWvSPNpPeahMeeSWlQXgBMKw==',key_name='tempest-TestNetworkBasicOps-247222292',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:25:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-bwgfnixj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:06Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ee5670f1-f0fa-4c86-855a-ce14c49091ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.710 225709 DEBUG nova.network.os_vif_util [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.711 225709 DEBUG nova.network.os_vif_util [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.712 225709 DEBUG os_vif [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.713 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.714 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb35832ce-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:32 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : haproxy version is 2.8.14-c23fe91
Jan 23 05:25:32 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : path to executable is /usr/sbin/haproxy
Jan 23 05:25:32 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [WARNING]  (236306) : Exiting Master process...
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [ALERT]    (236306) : Current worker (236308) exited with code 143 (Terminated)
Jan 23 05:25:32 np0005593294 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [WARNING]  (236306) : All workers exited. Exiting... (0)
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.718 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 systemd[1]: libpod-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d.scope: Deactivated successfully.
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.724 225709 INFO os_vif [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd')#033[00m
Jan 23 05:25:32 np0005593294 podman[236502]: 2026-01-23 10:25:32.726267878 +0000 UTC m=+0.072432906 container died b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:25:32 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:25:32 np0005593294 systemd[1]: var-lib-containers-storage-overlay-60ee88f003832c2d12671caf45936712538fa9185a4ea2311104f1ecd1bc220b-merged.mount: Deactivated successfully.
Jan 23 05:25:32 np0005593294 podman[236502]: 2026-01-23 10:25:32.771061842 +0000 UTC m=+0.117226900 container cleanup b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:25:32 np0005593294 systemd[1]: libpod-conmon-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d.scope: Deactivated successfully.
Jan 23 05:25:32 np0005593294 podman[236561]: 2026-01-23 10:25:32.867275107 +0000 UTC m=+0.059046169 container remove b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.874 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce0b25c-a7b9-44fb-97fd-08d35bf380ba]: (4, ('Fri Jan 23 10:25:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 (b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d)\nb80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d\nFri Jan 23 10:25:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 (b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d)\nb80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.877 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1a29c67e-f622-4270-abc3-9ecfcdc9de1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.879 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36237e8-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.882 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 kernel: tapa36237e8-b0: left promiscuous mode
Jan 23 05:25:32 np0005593294 nova_compute[225705]: 2026-01-23 10:25:32.905 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.911 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0f044922-d05b-4592-968a-64dc236b9457]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.931 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[821f446c-e57f-4d27-b1db-4c411816b48f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.934 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8d431976-2a11-4123-9eb2-3ba9e72b07c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.957 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[b0832980-2c64-4bcf-a100-36b3cb33aba4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505895, 'reachable_time': 28738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236576, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.961 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:25:32 np0005593294 systemd[1]: run-netns-ovnmeta\x2da36237e8\x2db709\x2d4a50\x2d8f8b\x2d9cccdf12f329.mount: Deactivated successfully.
Jan 23 05:25:32 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.961 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[04d69142-fb67-4453-aa4f-013452e1e9b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:33.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:33 np0005593294 nova_compute[225705]: 2026-01-23 10:25:33.133 225709 INFO nova.virt.libvirt.driver [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deleting instance files /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec_del#033[00m
Jan 23 05:25:33 np0005593294 nova_compute[225705]: 2026-01-23 10:25:33.134 225709 INFO nova.virt.libvirt.driver [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deletion of /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec_del complete#033[00m
Jan 23 05:25:33 np0005593294 nova_compute[225705]: 2026-01-23 10:25:33.200 225709 INFO nova.compute.manager [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:25:33 np0005593294 nova_compute[225705]: 2026-01-23 10:25:33.200 225709 DEBUG oslo.service.loopingcall [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:25:33 np0005593294 nova_compute[225705]: 2026-01-23 10:25:33.201 225709 DEBUG nova.compute.manager [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:25:33 np0005593294 nova_compute[225705]: 2026-01-23 10:25:33.201 225709 DEBUG nova.network.neutron [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:25:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:34 np0005593294 nova_compute[225705]: 2026-01-23 10:25:34.480 225709 DEBUG nova.compute.manager [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-unplugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:34 np0005593294 nova_compute[225705]: 2026-01-23 10:25:34.480 225709 DEBUG oslo_concurrency.lockutils [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:34 np0005593294 nova_compute[225705]: 2026-01-23 10:25:34.480 225709 DEBUG oslo_concurrency.lockutils [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:34 np0005593294 nova_compute[225705]: 2026-01-23 10:25:34.481 225709 DEBUG oslo_concurrency.lockutils [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:34 np0005593294 nova_compute[225705]: 2026-01-23 10:25:34.481 225709 DEBUG nova.compute.manager [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] No waiting events found dispatching network-vif-unplugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:34 np0005593294 nova_compute[225705]: 2026-01-23 10:25:34.481 225709 DEBUG nova.compute.manager [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-unplugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:25:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:34.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:35.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.814 225709 DEBUG nova.network.neutron [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.835 225709 INFO nova.compute.manager [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 2.63 seconds to deallocate network for instance.#033[00m
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.877 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.878 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.931 225709 DEBUG oslo_concurrency.processutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.956 225709 DEBUG nova.network.neutron [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updated VIF entry in instance network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:35 np0005593294 nova_compute[225705]: 2026-01-23 10:25:35.957 225709 DEBUG nova.network.neutron [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.002 225709 DEBUG oslo_concurrency.lockutils [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:36 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:25:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/976421879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.396 225709 DEBUG oslo_concurrency.processutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.405 225709 DEBUG nova.compute.provider_tree [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.421 225709 DEBUG nova.scheduler.client.report [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.472 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.495 225709 INFO nova.scheduler.client.report [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance ee5670f1-f0fa-4c86-855a-ce14c49091ec#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.561 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.562 225709 DEBUG oslo_concurrency.lockutils [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 DEBUG oslo_concurrency.lockutils [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 DEBUG oslo_concurrency.lockutils [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] No waiting events found dispatching network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 WARNING nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received unexpected event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.564 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-deleted-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.564 225709 INFO nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Neutron deleted interface b35832ce-bd22-4306-81e2-4d6c9cc4fb5e; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.564 225709 DEBUG nova.network.neutron [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.568 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.590 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Detach interface failed, port_id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e, reason: Instance ee5670f1-f0fa-4c86-855a-ce14c49091ec could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:25:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:36.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:36 np0005593294 nova_compute[225705]: 2026-01-23 10:25:36.722 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:37.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:37 np0005593294 nova_compute[225705]: 2026-01-23 10:25:37.718 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:38.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:39.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:39 np0005593294 nova_compute[225705]: 2026-01-23 10:25:39.869 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:39 np0005593294 nova_compute[225705]: 2026-01-23 10:25:39.985 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:40.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:41 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:41.047 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:41.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:41 np0005593294 nova_compute[225705]: 2026-01-23 10:25:41.726 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:42.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:42 np0005593294 podman[236606]: 2026-01-23 10:25:42.715777125 +0000 UTC m=+0.110607195 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:42 np0005593294 nova_compute[225705]: 2026-01-23 10:25:42.720 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:43.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:44.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:25:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:45.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:25:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:46.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:46 np0005593294 nova_compute[225705]: 2026-01-23 10:25:46.776 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:47.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:47 np0005593294 nova_compute[225705]: 2026-01-23 10:25:47.684 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163932.682029, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:47 np0005593294 nova_compute[225705]: 2026-01-23 10:25:47.684 225709 INFO nova.compute.manager [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:25:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102547 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:25:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [ALERT] 022/102547 (4) : backend 'backend' has no server available!
Jan 23 05:25:47 np0005593294 nova_compute[225705]: 2026-01-23 10:25:47.707 225709 DEBUG nova.compute.manager [None req-8ce47605-8964-4f06-9c06-e36a4c265841 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:47 np0005593294 nova_compute[225705]: 2026-01-23 10:25:47.724 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:48.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:49.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:50.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:51.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:51 np0005593294 nova_compute[225705]: 2026-01-23 10:25:51.779 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:52.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:52 np0005593294 nova_compute[225705]: 2026-01-23 10:25:52.726 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:53.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:54.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:55.056 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:55.057 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:25:55.057 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:55.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:56.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:56 np0005593294 nova_compute[225705]: 2026-01-23 10:25:56.781 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:25:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:25:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:25:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:25:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:57.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:57 np0005593294 podman[236666]: 2026-01-23 10:25:57.675774478 +0000 UTC m=+0.078821824 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:25:57 np0005593294 nova_compute[225705]: 2026-01-23 10:25:57.755 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:58.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:25:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:59.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:00.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:01.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:01 np0005593294 nova_compute[225705]: 2026-01-23 10:26:01.784 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:01 np0005593294 nova_compute[225705]: 2026-01-23 10:26:01.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:01 np0005593294 nova_compute[225705]: 2026-01-23 10:26:01.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:26:01 np0005593294 nova_compute[225705]: 2026-01-23 10:26:01.891 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:26:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:02.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:02 np0005593294 nova_compute[225705]: 2026-01-23 10:26:02.757 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:03.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:04.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:05.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:05 np0005593294 nova_compute[225705]: 2026-01-23 10:26:05.891 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:05 np0005593294 nova_compute[225705]: 2026-01-23 10:26:05.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:26:05 np0005593294 nova_compute[225705]: 2026-01-23 10:26:05.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:26:06 np0005593294 nova_compute[225705]: 2026-01-23 10:26:06.444 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:26:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:06.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:06 np0005593294 nova_compute[225705]: 2026-01-23 10:26:06.785 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:07.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:07 np0005593294 nova_compute[225705]: 2026-01-23 10:26:07.766 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:08 np0005593294 nova_compute[225705]: 2026-01-23 10:26:08.420 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:08.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:08 np0005593294 nova_compute[225705]: 2026-01-23 10:26:08.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:08 np0005593294 nova_compute[225705]: 2026-01-23 10:26:08.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:09.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:09 np0005593294 nova_compute[225705]: 2026-01-23 10:26:09.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:09 np0005593294 nova_compute[225705]: 2026-01-23 10:26:09.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:09 np0005593294 nova_compute[225705]: 2026-01-23 10:26:09.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:09 np0005593294 nova_compute[225705]: 2026-01-23 10:26:09.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:09 np0005593294 nova_compute[225705]: 2026-01-23 10:26:09.898 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:26:09 np0005593294 nova_compute[225705]: 2026-01-23 10:26:09.898 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:26:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2269378824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.417 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.626 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.628 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4894MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.628 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.628 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:10.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.737 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.738 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:26:10 np0005593294 nova_compute[225705]: 2026-01-23 10:26:10.757 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:11.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:26:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/304457320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.270 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.278 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.305 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.329 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.330 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.331 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.331 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:26:11 np0005593294 podman[236887]: 2026-01-23 10:26:11.454314705 +0000 UTC m=+0.075720728 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:26:11 np0005593294 podman[236887]: 2026-01-23 10:26:11.565171716 +0000 UTC m=+0.186577659 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:26:11 np0005593294 nova_compute[225705]: 2026-01-23 10:26:11.788 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:12 np0005593294 podman[237004]: 2026-01-23 10:26:12.082560202 +0000 UTC m=+0.068441532 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:26:12 np0005593294 podman[237004]: 2026-01-23 10:26:12.100025655 +0000 UTC m=+0.085907005 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:26:12 np0005593294 nova_compute[225705]: 2026-01-23 10:26:12.344 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:12 np0005593294 nova_compute[225705]: 2026-01-23 10:26:12.345 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:12 np0005593294 nova_compute[225705]: 2026-01-23 10:26:12.345 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:12 np0005593294 nova_compute[225705]: 2026-01-23 10:26:12.345 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:26:12 np0005593294 podman[237096]: 2026-01-23 10:26:12.533058275 +0000 UTC m=+0.071501296 container exec 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:26:12 np0005593294 podman[237096]: 2026-01-23 10:26:12.548767885 +0000 UTC m=+0.087210906 container exec_died 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Jan 23 05:26:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:12.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:12 np0005593294 nova_compute[225705]: 2026-01-23 10:26:12.769 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:12 np0005593294 podman[237161]: 2026-01-23 10:26:12.839885657 +0000 UTC m=+0.068873586 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 05:26:12 np0005593294 podman[237161]: 2026-01-23 10:26:12.848288709 +0000 UTC m=+0.077276628 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 05:26:13 np0005593294 podman[237198]: 2026-01-23 10:26:13.027803617 +0000 UTC m=+0.105068692 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:26:13 np0005593294 podman[237250]: 2026-01-23 10:26:13.159360082 +0000 UTC m=+0.076483292 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, architecture=x86_64, io.buildah.version=1.28.2, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 23 05:26:13 np0005593294 podman[237250]: 2026-01-23 10:26:13.177113354 +0000 UTC m=+0.094236594 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, com.redhat.component=keepalived-container, name=keepalived, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.28.2, vcs-type=git, vendor=Red Hat, Inc.)
Jan 23 05:26:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:13.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:13 np0005593294 nova_compute[225705]: 2026-01-23 10:26:13.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.300 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.301 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.319 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.375 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.376 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.382 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.383 225709 INFO nova.compute.claims [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.514 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:14.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:26:14 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1174453001' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:14 np0005593294 nova_compute[225705]: 2026-01-23 10:26:14.997 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.009 225709 DEBUG nova.compute.provider_tree [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.057 225709 DEBUG nova.scheduler.client.report [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.083 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.084 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.126 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.126 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.151 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.172 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:26:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:15.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.264 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.265 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.266 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Creating image(s)#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.298 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.333 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.370 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.374 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.466 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.468 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.469 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.469 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.507 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.513 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:15 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:15 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:15 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:26:15 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.805 225709 DEBUG nova.policy [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.808 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.295s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:15 np0005593294 nova_compute[225705]: 2026-01-23 10:26:15.882 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.000 225709 DEBUG nova.objects.instance [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid 7ca81dd2-d692-41ed-99b0-3046f49353ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.018 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.019 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Ensure instance console log exists: /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.020 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.021 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.021 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:16 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:16Z|00086|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 23 05:26:16 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:16 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:26:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:16.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:16 np0005593294 nova_compute[225705]: 2026-01-23 10:26:16.791 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:17 np0005593294 nova_compute[225705]: 2026-01-23 10:26:17.551 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Successfully created port: a8ceb3e7-8c43-461e-b444-6492e841b540 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:26:17 np0005593294 nova_compute[225705]: 2026-01-23 10:26:17.772 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:18.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:18 np0005593294 nova_compute[225705]: 2026-01-23 10:26:18.943 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Successfully updated port: a8ceb3e7-8c43-461e-b444-6492e841b540 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:26:18 np0005593294 nova_compute[225705]: 2026-01-23 10:26:18.959 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:18 np0005593294 nova_compute[225705]: 2026-01-23 10:26:18.960 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:18 np0005593294 nova_compute[225705]: 2026-01-23 10:26:18.960 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:26:19 np0005593294 nova_compute[225705]: 2026-01-23 10:26:19.037 225709 DEBUG nova.compute.manager [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:19 np0005593294 nova_compute[225705]: 2026-01-23 10:26:19.037 225709 DEBUG nova.compute.manager [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing instance network info cache due to event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:19 np0005593294 nova_compute[225705]: 2026-01-23 10:26:19.038 225709 DEBUG oslo_concurrency.lockutils [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:19 np0005593294 nova_compute[225705]: 2026-01-23 10:26:19.124 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:26:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.257 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.278 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.278 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance network_info: |[{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.278 225709 DEBUG oslo_concurrency.lockutils [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.279 225709 DEBUG nova.network.neutron [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.281 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start _get_guest_xml network_info=[{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.286 225709 WARNING nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.294 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.294 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.306 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.306 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.307 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.307 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.307 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.312 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:20.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:26:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1766751452' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.814 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.847 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:20 np0005593294 nova_compute[225705]: 2026-01-23 10:26:20.851 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:20 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:20 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:21.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.217 225709 DEBUG nova.network.neutron [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updated VIF entry in instance network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.218 225709 DEBUG nova.network.neutron [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.237 225709 DEBUG oslo_concurrency.lockutils [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:26:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1512136197' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.303 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.305 225709 DEBUG nova.virt.libvirt.vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-693116214',display_name='tempest-TestNetworkBasicOps-server-693116214',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-693116214',id=12,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjlNRjQ7HampCGTP/XTbCk9R3Ib+fDj6CfNlH4m79pVD9aYufMedp8ud5j/BRBY25VTiRpd/PxmVnv+wizUD3d3aoKtzcmvEyogkbp0bOIKJAePE4aMxKhzUKychHG7bA==',key_name='tempest-TestNetworkBasicOps-2017035378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-nxpw4fzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:15Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=7ca81dd2-d692-41ed-99b0-3046f49353ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.306 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.307 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.309 225709 DEBUG nova.objects.instance [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7ca81dd2-d692-41ed-99b0-3046f49353ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.322 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <uuid>7ca81dd2-d692-41ed-99b0-3046f49353ac</uuid>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <name>instance-0000000c</name>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <memory>131072</memory>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <vcpu>1</vcpu>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <metadata>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:name>tempest-TestNetworkBasicOps-server-693116214</nova:name>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:creationTime>2026-01-23 10:26:20</nova:creationTime>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:flavor name="m1.nano">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:memory>128</nova:memory>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:disk>1</nova:disk>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:swap>0</nova:swap>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </nova:flavor>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:owner>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </nova:owner>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <nova:ports>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <nova:port uuid="a8ceb3e7-8c43-461e-b444-6492e841b540">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        </nova:port>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </nova:ports>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </nova:instance>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </metadata>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <sysinfo type="smbios">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <system>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <entry name="serial">7ca81dd2-d692-41ed-99b0-3046f49353ac</entry>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <entry name="uuid">7ca81dd2-d692-41ed-99b0-3046f49353ac</entry>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </system>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </sysinfo>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <os>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <boot dev="hd"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <smbios mode="sysinfo"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </os>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <features>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <acpi/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <apic/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <vmcoreinfo/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </features>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <clock offset="utc">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <timer name="hpet" present="no"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </clock>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <cpu mode="host-model" match="exact">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </cpu>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  <devices>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <disk type="network" device="disk">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/7ca81dd2-d692-41ed-99b0-3046f49353ac_disk">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <target dev="vda" bus="virtio"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <disk type="network" device="cdrom">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <driver type="raw" cache="none"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <source protocol="rbd" name="vms/7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </source>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <auth username="openstack">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      </auth>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <target dev="sda" bus="sata"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </disk>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <interface type="ethernet">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <mac address="fa:16:3e:69:ef:f5"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <mtu size="1442"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <target dev="tapa8ceb3e7-8c"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </interface>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <serial type="pty">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <log file="/var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/console.log" append="off"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </serial>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <video>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <model type="virtio"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </video>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <input type="tablet" bus="usb"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <rng model="virtio">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </rng>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <controller type="usb" index="0"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    <memballoon model="virtio">
Jan 23 05:26:21 np0005593294 nova_compute[225705]:      <stats period="10"/>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:    </memballoon>
Jan 23 05:26:21 np0005593294 nova_compute[225705]:  </devices>
Jan 23 05:26:21 np0005593294 nova_compute[225705]: </domain>
Jan 23 05:26:21 np0005593294 nova_compute[225705]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.323 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Preparing to wait for external event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.324 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.325 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.325 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.326 225709 DEBUG nova.virt.libvirt.vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-693116214',display_name='tempest-TestNetworkBasicOps-server-693116214',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-693116214',id=12,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjlNRjQ7HampCGTP/XTbCk9R3Ib+fDj6CfNlH4m79pVD9aYufMedp8ud5j/BRBY25VTiRpd/PxmVnv+wizUD3d3aoKtzcmvEyogkbp0bOIKJAePE4aMxKhzUKychHG7bA==',key_name='tempest-TestNetworkBasicOps-2017035378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-nxpw4fzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:15Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=7ca81dd2-d692-41ed-99b0-3046f49353ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.327 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.328 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.329 225709 DEBUG os_vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.330 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.330 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.331 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.335 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.336 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8ceb3e7-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.337 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8ceb3e7-8c, col_values=(('external_ids', {'iface-id': 'a8ceb3e7-8c43-461e-b444-6492e841b540', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:ef:f5', 'vm-uuid': '7ca81dd2-d692-41ed-99b0-3046f49353ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:21 np0005593294 NetworkManager[48978]: <info>  [1769163981.3402] manager: (tapa8ceb3e7-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.342 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.346 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.347 225709 INFO os_vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c')
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.403 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.403 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.404 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:69:ef:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.405 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Using config drive
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.444 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.794 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.903 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Creating config drive at /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config
Jan 23 05:26:21 np0005593294 nova_compute[225705]: 2026-01-23 10:26:21.913 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8j_z160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:26:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.054 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8j_z160s" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.099 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.104 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.293 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.295 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deleting local config drive /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config because it was imported into RBD.
Jan 23 05:26:22 np0005593294 kernel: tapa8ceb3e7-8c: entered promiscuous mode
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.3699] manager: (tapa8ceb3e7-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.370 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:22 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:22Z|00087|binding|INFO|Claiming lport a8ceb3e7-8c43-461e-b444-6492e841b540 for this chassis.
Jan 23 05:26:22 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:22Z|00088|binding|INFO|a8ceb3e7-8c43-461e-b444-6492e841b540: Claiming fa:16:3e:69:ef:f5 10.100.0.9
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.3920] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.391 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.3938] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.395 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:ef:f5 10.100.0.9'], port_security=['fa:16:3e:69:ef:f5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7ca81dd2-d692-41ed-99b0-3046f49353ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0542887a-7598-408a-a342-24bd8aead651', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=a8ceb3e7-8c43-461e-b444-6492e841b540) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.398 143098 INFO neutron.agent.ovn.metadata.agent [-] Port a8ceb3e7-8c43-461e-b444-6492e841b540 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a bound to our chassis
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.400 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 05:26:22 np0005593294 systemd-machined[194551]: New machine qemu-5-instance-0000000c.
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.416 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d64b232a-6ace-498c-8dbb-ad4c9d559e90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.418 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap712c0ef6-f1 in ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.419 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap712c0ef6-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.419 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb36935-0b8c-40b3-a28f-a6eb5e5662b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.420 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdf73fd-bf5a-4072-b2e2-895483ecf520]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.435 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[6161e87d-74b8-41d8-8299-5f60043ed618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.464 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e8939828-0b37-4fae-b92d-5910f26e94ba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.492 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.500 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.502 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[52ac4734-6d95-4d0b-bba5-63e3105d7b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:22Z|00089|binding|INFO|Setting lport a8ceb3e7-8c43-461e-b444-6492e841b540 up in Southbound
Jan 23 05:26:22 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:22Z|00090|binding|INFO|Setting lport a8ceb3e7-8c43-461e-b444-6492e841b540 ovn-installed in OVS
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.508 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d45908ef-460c-4e0f-bca5-5c82eeb2c8ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.509 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.5100] manager: (tap712c0ef6-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Jan 23 05:26:22 np0005593294 systemd-udevd[237724]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:22 np0005593294 systemd-udevd[237726]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.5377] device (tapa8ceb3e7-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.5384] device (tapa8ceb3e7-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.547 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef83f9c-ad0f-4518-8939-4d75a9d0df28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.550 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c0eda2cc-0df4-4327-883d-3c3e4aa0fd26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.5760] device (tap712c0ef6-f0): carrier: link connected
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.583 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[4bba78c6-3c81-4bd3-84cc-d609dcc8a172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.609 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdc4bb3-67d3-4c99-a321-11617e059807]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513655, 'reachable_time': 32308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237751, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.631 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad098e4-33a4-4725-b9de-e4d1dc05aeed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:ec06'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513655, 'tstamp': 513655}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237752, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.665 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3258b7-4677-45a6-9438-a49cbc4e1f50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513655, 'reachable_time': 32308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237753, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:22.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.696 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[dda21ece-7e99-48e0-9dc6-dec403cc344a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.756 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[39db4172-d722-4ade-bb7b-3f7ec12d293c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.757 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.757 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.757 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap712c0ef6-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:22 np0005593294 kernel: tap712c0ef6-f0: entered promiscuous mode
Jan 23 05:26:22 np0005593294 NetworkManager[48978]: <info>  [1769163982.7602] manager: (tap712c0ef6-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.759 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.766 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap712c0ef6-f0, col_values=(('external_ids', {'iface-id': '6c333384-cae4-4f40-8b56-257e8d961c46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:22 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:22Z|00091|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.768 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.780 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.780 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.781 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[86395a1f-84c3-4815-8ee5-7fc17f9d6b0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.782 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: global
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    log         /dev/log local0 debug
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    log-tag     haproxy-metadata-proxy-712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    user        root
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    group       root
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    maxconn     1024
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    pidfile     /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    daemon
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: defaults
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    log global
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    mode http
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    option httplog
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    option dontlognull
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    option http-server-close
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    option forwardfor
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    retries                 3
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    timeout http-request    30s
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    timeout connect         30s
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    timeout client          32s
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    timeout server          32s
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    timeout http-keep-alive 30s
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: listen listener
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    bind 169.254.169.254:80
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]:    http-request add-header X-OVN-Network-ID 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:26:22 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.782 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'env', 'PROCESS_TAG=haproxy-712c0ef6-fbbe-4577-b44d-9610116b414a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/712c0ef6-fbbe-4577-b44d-9610116b414a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.852 225709 DEBUG nova.compute.manager [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.853 225709 DEBUG oslo_concurrency.lockutils [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.854 225709 DEBUG oslo_concurrency.lockutils [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.854 225709 DEBUG oslo_concurrency.lockutils [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:22 np0005593294 nova_compute[225705]: 2026-01-23 10:26:22.855 225709 DEBUG nova.compute.manager [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Processing event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:26:23 np0005593294 podman[237786]: 2026-01-23 10:26:23.201389894 +0000 UTC m=+0.078261727 container create 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:23.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:23 np0005593294 systemd[1]: Started libpod-conmon-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297.scope.
Jan 23 05:26:23 np0005593294 podman[237786]: 2026-01-23 10:26:23.164891308 +0000 UTC m=+0.041763211 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:26:23 np0005593294 systemd[1]: Started libcrun container.
Jan 23 05:26:23 np0005593294 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f38b4c9d64371d54aa104ef7a398b75b4fd9331f2fa9bb8717afb31f7a935f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:26:23 np0005593294 podman[237786]: 2026-01-23 10:26:23.308331322 +0000 UTC m=+0.185203255 container init 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:26:23 np0005593294 podman[237786]: 2026-01-23 10:26:23.317994814 +0000 UTC m=+0.194866667 container start 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:23 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : New worker (237818) forked
Jan 23 05:26:23 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : Loading success.
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.519 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163983.5185375, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.519 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Started (Lifecycle Event)#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.522 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.526 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.530 225709 INFO nova.virt.libvirt.driver [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance spawned successfully.#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.531 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.542 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.547 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.555 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.556 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.557 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.557 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.558 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.558 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.568 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.569 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163983.5186481, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.569 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.591 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.594 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163983.5250778, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.595 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.626 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.630 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.634 225709 INFO nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 8.37 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.635 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.663 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.697 225709 INFO nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 9.34 seconds to build instance.#033[00m
Jan 23 05:26:23 np0005593294 nova_compute[225705]: 2026-01-23 10:26:23.733 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:24.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:24 np0005593294 nova_compute[225705]: 2026-01-23 10:26:24.936 225709 DEBUG nova.compute.manager [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:24 np0005593294 nova_compute[225705]: 2026-01-23 10:26:24.936 225709 DEBUG oslo_concurrency.lockutils [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:24 np0005593294 nova_compute[225705]: 2026-01-23 10:26:24.937 225709 DEBUG oslo_concurrency.lockutils [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:24 np0005593294 nova_compute[225705]: 2026-01-23 10:26:24.937 225709 DEBUG oslo_concurrency.lockutils [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:24 np0005593294 nova_compute[225705]: 2026-01-23 10:26:24.938 225709 DEBUG nova.compute.manager [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] No waiting events found dispatching network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:24 np0005593294 nova_compute[225705]: 2026-01-23 10:26:24.938 225709 WARNING nova.compute.manager [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received unexpected event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:25.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:26 np0005593294 nova_compute[225705]: 2026-01-23 10:26:26.342 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:26.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:26 np0005593294 nova_compute[225705]: 2026-01-23 10:26:26.796 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:27.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:28.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:28 np0005593294 podman[237884]: 2026-01-23 10:26:28.753806446 +0000 UTC m=+0.144326593 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:26:28 np0005593294 nova_compute[225705]: 2026-01-23 10:26:28.910 225709 DEBUG nova.compute.manager [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:28 np0005593294 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG nova.compute.manager [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing instance network info cache due to event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:28 np0005593294 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG oslo_concurrency.lockutils [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:28 np0005593294 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG oslo_concurrency.lockutils [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:28 np0005593294 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG nova.network.neutron [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:29.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:30 np0005593294 nova_compute[225705]: 2026-01-23 10:26:30.004 225709 DEBUG nova.network.neutron [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updated VIF entry in instance network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:30 np0005593294 nova_compute[225705]: 2026-01-23 10:26:30.005 225709 DEBUG nova.network.neutron [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:30 np0005593294 nova_compute[225705]: 2026-01-23 10:26:30.272 225709 DEBUG oslo_concurrency.lockutils [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:30.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:31.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:31 np0005593294 nova_compute[225705]: 2026-01-23 10:26:31.346 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:31 np0005593294 nova_compute[225705]: 2026-01-23 10:26:31.800 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:32.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:33.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:34.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:35.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:36 np0005593294 nova_compute[225705]: 2026-01-23 10:26:36.350 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:36.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:36 np0005593294 nova_compute[225705]: 2026-01-23 10:26:36.802 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:37 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:37Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:69:ef:f5 10.100.0.9
Jan 23 05:26:37 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:37Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:ef:f5 10.100.0.9
Jan 23 05:26:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:37.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:38.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:39.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:40.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:41.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:41 np0005593294 nova_compute[225705]: 2026-01-23 10:26:41.353 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:41 np0005593294 nova_compute[225705]: 2026-01-23 10:26:41.806 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:42.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:43.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:43 np0005593294 podman[237912]: 2026-01-23 10:26:43.703740198 +0000 UTC m=+0.104784302 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:26:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:45.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:46 np0005593294 nova_compute[225705]: 2026-01-23 10:26:46.356 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:46.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:46 np0005593294 nova_compute[225705]: 2026-01-23 10:26:46.843 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:47 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:46.999 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:47 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:47.001 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:26:47 np0005593294 nova_compute[225705]: 2026-01-23 10:26:47.001 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:47.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:48.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:49.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:50.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:51.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:51 np0005593294 nova_compute[225705]: 2026-01-23 10:26:51.360 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:51 np0005593294 nova_compute[225705]: 2026-01-23 10:26:51.847 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:52.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:53.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.382 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.382 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.383 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.383 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.383 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.384 225709 INFO nova.compute.manager [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Terminating instance#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.385 225709 DEBUG nova.compute.manager [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:26:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.874 225709 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.875 225709 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing instance network info cache due to event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.875 225709 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.875 225709 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:53 np0005593294 nova_compute[225705]: 2026-01-23 10:26:53.876 225709 DEBUG nova.network.neutron [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:54 np0005593294 kernel: tapa8ceb3e7-8c (unregistering): left promiscuous mode
Jan 23 05:26:54 np0005593294 NetworkManager[48978]: <info>  [1769164014.4634] device (tapa8ceb3e7-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:26:54 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:54Z|00092|binding|INFO|Releasing lport a8ceb3e7-8c43-461e-b444-6492e841b540 from this chassis (sb_readonly=0)
Jan 23 05:26:54 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:54Z|00093|binding|INFO|Setting lport a8ceb3e7-8c43-461e-b444-6492e841b540 down in Southbound
Jan 23 05:26:54 np0005593294 ovn_controller[133293]: 2026-01-23T10:26:54Z|00094|binding|INFO|Removing iface tapa8ceb3e7-8c ovn-installed in OVS
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.473 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.476 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.502 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:ef:f5 10.100.0.9'], port_security=['fa:16:3e:69:ef:f5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7ca81dd2-d692-41ed-99b0-3046f49353ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0542887a-7598-408a-a342-24bd8aead651', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=a8ceb3e7-8c43-461e-b444-6492e841b540) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.503 143098 INFO neutron.agent.ovn.metadata.agent [-] Port a8ceb3e7-8c43-461e-b444-6492e841b540 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a unbound from our chassis#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.504 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 712c0ef6-fbbe-4577-b44d-9610116b414a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.507 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[587f6daf-c1a1-4f38-bc25-f1ecf54530f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.508 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace which is not needed anymore#033[00m
Jan 23 05:26:54 np0005593294 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 23 05:26:54 np0005593294 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 14.819s CPU time.
Jan 23 05:26:54 np0005593294 systemd-machined[194551]: Machine qemu-5-instance-0000000c terminated.
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.620 225709 INFO nova.virt.libvirt.driver [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance destroyed successfully.#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.621 225709 DEBUG nova.objects.instance [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid 7ca81dd2-d692-41ed-99b0-3046f49353ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.649 225709 DEBUG nova.virt.libvirt.vif [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:26:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-693116214',display_name='tempest-TestNetworkBasicOps-server-693116214',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-693116214',id=12,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjlNRjQ7HampCGTP/XTbCk9R3Ib+fDj6CfNlH4m79pVD9aYufMedp8ud5j/BRBY25VTiRpd/PxmVnv+wizUD3d3aoKtzcmvEyogkbp0bOIKJAePE4aMxKhzUKychHG7bA==',key_name='tempest-TestNetworkBasicOps-2017035378',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-nxpw4fzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=7ca81dd2-d692-41ed-99b0-3046f49353ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.649 225709 DEBUG nova.network.os_vif_util [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.650 225709 DEBUG nova.network.os_vif_util [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.650 225709 DEBUG os_vif [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.652 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.652 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8ceb3e7-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.653 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.655 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : haproxy version is 2.8.14-c23fe91
Jan 23 05:26:54 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : path to executable is /usr/sbin/haproxy
Jan 23 05:26:54 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [WARNING]  (237803) : Exiting Master process...
Jan 23 05:26:54 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [WARNING]  (237803) : Exiting Master process...
Jan 23 05:26:54 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [ALERT]    (237803) : Current worker (237818) exited with code 143 (Terminated)
Jan 23 05:26:54 np0005593294 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [WARNING]  (237803) : All workers exited. Exiting... (0)
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.658 225709 INFO os_vif [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c')#033[00m
Jan 23 05:26:54 np0005593294 systemd[1]: libpod-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297.scope: Deactivated successfully.
Jan 23 05:26:54 np0005593294 podman[237993]: 2026-01-23 10:26:54.667198122 +0000 UTC m=+0.057348865 container died 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:54 np0005593294 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297-userdata-shm.mount: Deactivated successfully.
Jan 23 05:26:54 np0005593294 systemd[1]: var-lib-containers-storage-overlay-7f38b4c9d64371d54aa104ef7a398b75b4fd9331f2fa9bb8717afb31f7a935f2-merged.mount: Deactivated successfully.
Jan 23 05:26:54 np0005593294 podman[237993]: 2026-01-23 10:26:54.70952779 +0000 UTC m=+0.099678533 container cleanup 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:26:54 np0005593294 systemd[1]: libpod-conmon-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297.scope: Deactivated successfully.
Jan 23 05:26:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:54.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:54 np0005593294 podman[238052]: 2026-01-23 10:26:54.779371594 +0000 UTC m=+0.047609853 container remove 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.784 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c46d2b15-b1d9-49b1-8e2a-b8faafebf4d6]: (4, ('Fri Jan 23 10:26:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297)\n0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297\nFri Jan 23 10:26:54 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297)\n0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.787 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6d01806c-f6f7-40de-b21c-192177a8de8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.788 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.790 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 kernel: tap712c0ef6-f0: left promiscuous mode
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.804 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 nova_compute[225705]: 2026-01-23 10:26:54.806 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.809 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2af8715b-cd4f-430a-91a8-0e9436dc3580]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.826 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb88e39-8077-4468-9c82-300e59d8b804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.827 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[28ee66bd-82c7-4535-91b1-49a65d828838]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.842 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d08f0f-464f-4d92-87e8-521f93915fd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513647, 'reachable_time': 39515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238067, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:54 np0005593294 systemd[1]: run-netns-ovnmeta\x2d712c0ef6\x2dfbbe\x2d4577\x2db44d\x2d9610116b414a.mount: Deactivated successfully.
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.849 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:26:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.850 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[47bad867-d72a-4c77-a8e5-76be91908433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:55.057 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:55 np0005593294 nova_compute[225705]: 2026-01-23 10:26:55.112 225709 DEBUG nova.network.neutron [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updated VIF entry in instance network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:55 np0005593294 nova_compute[225705]: 2026-01-23 10:26:55.113 225709 DEBUG nova.network.neutron [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:55 np0005593294 nova_compute[225705]: 2026-01-23 10:26:55.163 225709 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:55.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:56 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:26:56.003 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.295 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-unplugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.296 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.296 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] No waiting events found dispatching network-vif-unplugged-a8ceb3e7-8c43-461e-b444-6492e841b540 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-unplugged-a8ceb3e7-8c43-461e-b444-6492e841b540 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] No waiting events found dispatching network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.299 225709 WARNING nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received unexpected event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:26:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:56.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:56 np0005593294 nova_compute[225705]: 2026-01-23 10:26:56.850 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:26:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:26:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:26:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:26:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:57.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:58.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:26:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:59.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.287 225709 INFO nova.virt.libvirt.driver [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deleting instance files /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac_del#033[00m
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.288 225709 INFO nova.virt.libvirt.driver [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deletion of /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac_del complete#033[00m
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.510 225709 INFO nova.compute.manager [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 6.12 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.511 225709 DEBUG oslo.service.loopingcall [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.511 225709 DEBUG nova.compute.manager [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.511 225709 DEBUG nova.network.neutron [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:26:59 np0005593294 nova_compute[225705]: 2026-01-23 10:26:59.653 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:59 np0005593294 podman[238072]: 2026-01-23 10:26:59.69222588 +0000 UTC m=+0.079880538 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:27:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:00.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.061 225709 DEBUG nova.network.neutron [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.301 225709 DEBUG nova.compute.manager [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-deleted-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.302 225709 INFO nova.compute.manager [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Neutron deleted interface a8ceb3e7-8c43-461e-b444-6492e841b540; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.302 225709 DEBUG nova.network.neutron [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.335 225709 INFO nova.compute.manager [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 1.82 seconds to deallocate network for instance.#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.342 225709 DEBUG nova.compute.manager [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Detach interface failed, port_id=a8ceb3e7-8c43-461e-b444-6492e841b540, reason: Instance 7ca81dd2-d692-41ed-99b0-3046f49353ac could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.471 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.471 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.596 225709 DEBUG oslo_concurrency.processutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:01 np0005593294 nova_compute[225705]: 2026-01-23 10:27:01.852 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:27:02 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/138703590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:02 np0005593294 nova_compute[225705]: 2026-01-23 10:27:02.075 225709 DEBUG oslo_concurrency.processutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:02 np0005593294 nova_compute[225705]: 2026-01-23 10:27:02.082 225709 DEBUG nova.compute.provider_tree [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:02 np0005593294 nova_compute[225705]: 2026-01-23 10:27:02.576 225709 DEBUG nova.scheduler.client.report [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:02 np0005593294 nova_compute[225705]: 2026-01-23 10:27:02.740 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:02.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:02 np0005593294 nova_compute[225705]: 2026-01-23 10:27:02.879 225709 INFO nova.scheduler.client.report [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance 7ca81dd2-d692-41ed-99b0-3046f49353ac#033[00m
Jan 23 05:27:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:03.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:04 np0005593294 nova_compute[225705]: 2026-01-23 10:27:04.263 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:04 np0005593294 nova_compute[225705]: 2026-01-23 10:27:04.656 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:04.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:05.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:06.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:06 np0005593294 nova_compute[225705]: 2026-01-23 10:27:06.856 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:06 np0005593294 nova_compute[225705]: 2026-01-23 10:27:06.897 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:06 np0005593294 nova_compute[225705]: 2026-01-23 10:27:06.897 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:27:06 np0005593294 nova_compute[225705]: 2026-01-23 10:27:06.898 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:27:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:07.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:07 np0005593294 nova_compute[225705]: 2026-01-23 10:27:07.446 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:27:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:08.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:09.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:09 np0005593294 nova_compute[225705]: 2026-01-23 10:27:09.419 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:09 np0005593294 nova_compute[225705]: 2026-01-23 10:27:09.618 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164014.6176653, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:09 np0005593294 nova_compute[225705]: 2026-01-23 10:27:09.619 225709 INFO nova.compute.manager [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:27:09 np0005593294 nova_compute[225705]: 2026-01-23 10:27:09.658 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:09 np0005593294 nova_compute[225705]: 2026-01-23 10:27:09.986 225709 DEBUG nova.compute.manager [None req-0c708697-ab87-444a-869f-e1d106eb6707 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:10.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:10 np0005593294 nova_compute[225705]: 2026-01-23 10:27:10.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:10 np0005593294 nova_compute[225705]: 2026-01-23 10:27:10.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:11.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.859 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.902 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:27:11 np0005593294 nova_compute[225705]: 2026-01-23 10:27:11.903 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:27:12 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1204745815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.436 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.614 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.615 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4885MB free_disk=59.94270324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.615 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.615 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.684 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.685 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:27:12 np0005593294 nova_compute[225705]: 2026-01-23 10:27:12.705 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:12.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:27:13 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1089974770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:13 np0005593294 nova_compute[225705]: 2026-01-23 10:27:13.167 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:13 np0005593294 nova_compute[225705]: 2026-01-23 10:27:13.173 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:13 np0005593294 nova_compute[225705]: 2026-01-23 10:27:13.189 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:13 np0005593294 nova_compute[225705]: 2026-01-23 10:27:13.212 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:27:13 np0005593294 nova_compute[225705]: 2026-01-23 10:27:13.213 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:13.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:14 np0005593294 nova_compute[225705]: 2026-01-23 10:27:14.213 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:14 np0005593294 nova_compute[225705]: 2026-01-23 10:27:14.214 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:14 np0005593294 nova_compute[225705]: 2026-01-23 10:27:14.214 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:14 np0005593294 nova_compute[225705]: 2026-01-23 10:27:14.215 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:27:14 np0005593294 nova_compute[225705]: 2026-01-23 10:27:14.695 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:14.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:14 np0005593294 podman[238193]: 2026-01-23 10:27:14.778446534 +0000 UTC m=+0.163231422 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:27:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:15.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:15 np0005593294 nova_compute[225705]: 2026-01-23 10:27:15.870 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:15 np0005593294 nova_compute[225705]: 2026-01-23 10:27:15.888 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:16.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:16 np0005593294 nova_compute[225705]: 2026-01-23 10:27:16.861 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:17.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:18.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:19.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:19 np0005593294 nova_compute[225705]: 2026-01-23 10:27:19.698 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:20.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:21.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:21 np0005593294 nova_compute[225705]: 2026-01-23 10:27:21.862 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:22.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:23.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:24 np0005593294 nova_compute[225705]: 2026-01-23 10:27:24.701 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:24.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:25.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:25 np0005593294 nova_compute[225705]: 2026-01-23 10:27:25.681 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:25 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:25 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:25 np0005593294 nova_compute[225705]: 2026-01-23 10:27:25.789 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:27:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:26 np0005593294 ceph-mon[80126]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 05:27:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:26 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:27:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:26.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:26 np0005593294 nova_compute[225705]: 2026-01-23 10:27:26.901 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:27.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:28.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:29.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:29 np0005593294 nova_compute[225705]: 2026-01-23 10:27:29.703 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:30 np0005593294 podman[238334]: 2026-01-23 10:27:30.677928094 +0000 UTC m=+0.069328577 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:27:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:30.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:31.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:31 np0005593294 nova_compute[225705]: 2026-01-23 10:27:31.905 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:32 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:32 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:32.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:33.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:34 np0005593294 nova_compute[225705]: 2026-01-23 10:27:34.705 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:34.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:35.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:36.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:36 np0005593294 nova_compute[225705]: 2026-01-23 10:27:36.938 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:39.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:39 np0005593294 nova_compute[225705]: 2026-01-23 10:27:39.707 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:40.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:41 np0005593294 nova_compute[225705]: 2026-01-23 10:27:41.943 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:42.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:43.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:44 np0005593294 nova_compute[225705]: 2026-01-23 10:27:44.710 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:44.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:45.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:45 np0005593294 podman[238388]: 2026-01-23 10:27:45.724304048 +0000 UTC m=+0.114750401 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:27:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:46.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:46 np0005593294 nova_compute[225705]: 2026-01-23 10:27:46.946 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:47.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:47 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:27:47.570 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:47 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:27:47.571 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:27:47 np0005593294 nova_compute[225705]: 2026-01-23 10:27:47.572 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:48.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:49.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:49 np0005593294 nova_compute[225705]: 2026-01-23 10:27:49.712 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:50.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:51.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:51 np0005593294 nova_compute[225705]: 2026-01-23 10:27:51.948 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:52.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:53.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:54 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:27:54.572 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:54 np0005593294 nova_compute[225705]: 2026-01-23 10:27:54.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:54.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:27:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:27:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:27:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:55.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:56.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:56 np0005593294 nova_compute[225705]: 2026-01-23 10:27:56.951 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:27:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:27:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:27:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:27:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:57.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:58.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:27:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:59.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:59 np0005593294 nova_compute[225705]: 2026-01-23 10:27:59.761 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:00.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:01.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:01 np0005593294 podman[238448]: 2026-01-23 10:28:01.680666878 +0000 UTC m=+0.070871575 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:28:01 np0005593294 nova_compute[225705]: 2026-01-23 10:28:01.997 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:02.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:03.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:04 np0005593294 nova_compute[225705]: 2026-01-23 10:28:04.764 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:04.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:05.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:06.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:07 np0005593294 nova_compute[225705]: 2026-01-23 10:28:07.000 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:07 np0005593294 ovn_controller[133293]: 2026-01-23T10:28:07Z|00095|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 05:28:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:07.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:07 np0005593294 nova_compute[225705]: 2026-01-23 10:28:07.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:07 np0005593294 nova_compute[225705]: 2026-01-23 10:28:07.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:28:07 np0005593294 nova_compute[225705]: 2026-01-23 10:28:07.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:28:08 np0005593294 nova_compute[225705]: 2026-01-23 10:28:08.065 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:28:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:08.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:09 np0005593294 nova_compute[225705]: 2026-01-23 10:28:09.061 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:09.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:09 np0005593294 nova_compute[225705]: 2026-01-23 10:28:09.766 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:10.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:11.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:11 np0005593294 nova_compute[225705]: 2026-01-23 10:28:11.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:11 np0005593294 nova_compute[225705]: 2026-01-23 10:28:11.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:11 np0005593294 nova_compute[225705]: 2026-01-23 10:28:11.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:11 np0005593294 nova_compute[225705]: 2026-01-23 10:28:11.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:11 np0005593294 nova_compute[225705]: 2026-01-23 10:28:11.903 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:28:11 np0005593294 nova_compute[225705]: 2026-01-23 10:28:11.904 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.004 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:28:12 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/615719513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.464 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.618 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.620 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4907MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.620 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.620 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.824 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.825 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:28:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:12.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:12 np0005593294 nova_compute[225705]: 2026-01-23 10:28:12.956 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:13.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:28:13 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3161038119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:13 np0005593294 nova_compute[225705]: 2026-01-23 10:28:13.477 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:13 np0005593294 nova_compute[225705]: 2026-01-23 10:28:13.484 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:13 np0005593294 nova_compute[225705]: 2026-01-23 10:28:13.509 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:28:13 np0005593294 nova_compute[225705]: 2026-01-23 10:28:13.512 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:28:13 np0005593294 nova_compute[225705]: 2026-01-23 10:28:13.512 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.512 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.512 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.513 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.513 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.513 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.768 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:14.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:14 np0005593294 nova_compute[225705]: 2026-01-23 10:28:14.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:15.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:15 np0005593294 nova_compute[225705]: 2026-01-23 10:28:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:16 np0005593294 podman[238543]: 2026-01-23 10:28:16.69485291 +0000 UTC m=+0.097310905 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:28:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:16.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:17 np0005593294 nova_compute[225705]: 2026-01-23 10:28:17.006 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:17.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:18.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:19.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:19 np0005593294 nova_compute[225705]: 2026-01-23 10:28:19.771 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:20.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:21.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:22 np0005593294 nova_compute[225705]: 2026-01-23 10:28:22.009 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:22 np0005593294 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 05:28:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:22.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:23.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:24 np0005593294 nova_compute[225705]: 2026-01-23 10:28:24.813 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:24.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:25.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:26.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:27 np0005593294 nova_compute[225705]: 2026-01-23 10:28:27.012 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:27.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:28.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:29.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:29 np0005593294 nova_compute[225705]: 2026-01-23 10:28:29.815 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:30.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:31.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:31 np0005593294 podman[238625]: 2026-01-23 10:28:31.912780293 +0000 UTC m=+0.091454056 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:28:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:32 np0005593294 nova_compute[225705]: 2026-01-23 10:28:32.015 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:32.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:33.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:33 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:28:33 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:33 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:33 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:28:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:34 np0005593294 nova_compute[225705]: 2026-01-23 10:28:34.817 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:34.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:35.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:36.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:37 np0005593294 nova_compute[225705]: 2026-01-23 10:28:37.017 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:37.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:37 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:38.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:39.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:39 np0005593294 nova_compute[225705]: 2026-01-23 10:28:39.818 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:40.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:41.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:42 np0005593294 nova_compute[225705]: 2026-01-23 10:28:42.021 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:42.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:43 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:28:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:43.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:44 np0005593294 nova_compute[225705]: 2026-01-23 10:28:44.821 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:44.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:45.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:46.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:47 np0005593294 nova_compute[225705]: 2026-01-23 10:28:47.021 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:47 np0005593294 podman[238758]: 2026-01-23 10:28:47.207269552 +0000 UTC m=+0.118444654 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 23 05:28:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:47.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:48.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:49.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:49 np0005593294 nova_compute[225705]: 2026-01-23 10:28:49.823 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:50.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:28:50.976 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:28:50 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:28:50.977 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:28:50 np0005593294 nova_compute[225705]: 2026-01-23 10:28:50.977 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:51.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:52 np0005593294 nova_compute[225705]: 2026-01-23 10:28:52.024 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.761676) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132761768, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2389, "num_deletes": 251, "total_data_size": 6545311, "memory_usage": 6629904, "flush_reason": "Manual Compaction"}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132785331, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4222889, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31129, "largest_seqno": 33513, "table_properties": {"data_size": 4213075, "index_size": 6244, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19896, "raw_average_key_size": 20, "raw_value_size": 4193741, "raw_average_value_size": 4305, "num_data_blocks": 264, "num_entries": 974, "num_filter_entries": 974, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163913, "oldest_key_time": 1769163913, "file_creation_time": 1769164132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 23867 microseconds, and 11703 cpu microseconds.
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.785543) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4222889 bytes OK
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.785629) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.787620) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.787637) EVENT_LOG_v1 {"time_micros": 1769164132787632, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.787659) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6534901, prev total WAL file size 6534901, number of live WAL files 2.
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.789698) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4123KB)], [60(12MB)]
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132789821, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16839822, "oldest_snapshot_seqno": -1}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6336 keys, 14607576 bytes, temperature: kUnknown
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132891817, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14607576, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14564997, "index_size": 25637, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15877, "raw_key_size": 162344, "raw_average_key_size": 25, "raw_value_size": 14450534, "raw_average_value_size": 2280, "num_data_blocks": 1024, "num_entries": 6336, "num_filter_entries": 6336, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.892110) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14607576 bytes
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.894848) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.9 rd, 143.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.0 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.4) write-amplify(3.5) OK, records in: 6854, records dropped: 518 output_compression: NoCompression
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.894900) EVENT_LOG_v1 {"time_micros": 1769164132894879, "job": 36, "event": "compaction_finished", "compaction_time_micros": 102103, "compaction_time_cpu_micros": 30018, "output_level": 6, "num_output_files": 1, "total_output_size": 14607576, "num_input_records": 6854, "num_output_records": 6336, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132896544, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132899508, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.789512) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:52 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:52.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:53.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:54 np0005593294 nova_compute[225705]: 2026-01-23 10:28:54.826 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:54.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:28:55.059 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:28:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:28:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:55.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:56.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:28:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:28:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:28:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:28:57 np0005593294 nova_compute[225705]: 2026-01-23 10:28:57.026 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:57.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:58.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:28:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:59.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:59 np0005593294 nova_compute[225705]: 2026-01-23 10:28:59.828 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:00.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:00 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:29:00.980 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:01.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:02 np0005593294 nova_compute[225705]: 2026-01-23 10:29:02.053 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:02 np0005593294 podman[238793]: 2026-01-23 10:29:02.634927094 +0000 UTC m=+0.044673451 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:29:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:02.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:04 np0005593294 nova_compute[225705]: 2026-01-23 10:29:04.830 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:06.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:07 np0005593294 nova_compute[225705]: 2026-01-23 10:29:07.056 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:08.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:09.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:09 np0005593294 nova_compute[225705]: 2026-01-23 10:29:09.832 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:09 np0005593294 nova_compute[225705]: 2026-01-23 10:29:09.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:09 np0005593294 nova_compute[225705]: 2026-01-23 10:29:09.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:29:09 np0005593294 nova_compute[225705]: 2026-01-23 10:29:09.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:29:10 np0005593294 nova_compute[225705]: 2026-01-23 10:29:10.010 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:29:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:10.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:11 np0005593294 nova_compute[225705]: 2026-01-23 10:29:11.004 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:11.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:12 np0005593294 nova_compute[225705]: 2026-01-23 10:29:12.058 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:12.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.903 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:29:13 np0005593294 nova_compute[225705]: 2026-01-23 10:29:13.904 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:29:14 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3441369713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.431 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.631 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.632 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4914MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.633 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.633 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.697 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.697 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.720 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.738 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.739 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.756 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.783 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.802 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:14 np0005593294 nova_compute[225705]: 2026-01-23 10:29:14.834 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:29:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:14.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:29:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:29:15 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1986486844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:15 np0005593294 nova_compute[225705]: 2026-01-23 10:29:15.292 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:15 np0005593294 nova_compute[225705]: 2026-01-23 10:29:15.300 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:29:15 np0005593294 nova_compute[225705]: 2026-01-23 10:29:15.316 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:29:15 np0005593294 nova_compute[225705]: 2026-01-23 10:29:15.317 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:29:15 np0005593294 nova_compute[225705]: 2026-01-23 10:29:15.318 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:15.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:16 np0005593294 nova_compute[225705]: 2026-01-23 10:29:16.317 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593294 nova_compute[225705]: 2026-01-23 10:29:16.318 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593294 nova_compute[225705]: 2026-01-23 10:29:16.319 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593294 nova_compute[225705]: 2026-01-23 10:29:16.319 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593294 nova_compute[225705]: 2026-01-23 10:29:16.319 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:29:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:16.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:17 np0005593294 nova_compute[225705]: 2026-01-23 10:29:17.062 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:17.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:17 np0005593294 podman[238889]: 2026-01-23 10:29:17.698346787 +0000 UTC m=+0.105238739 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:29:17 np0005593294 nova_compute[225705]: 2026-01-23 10:29:17.870 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:17 np0005593294 nova_compute[225705]: 2026-01-23 10:29:17.892 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:18.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:19.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:19 np0005593294 nova_compute[225705]: 2026-01-23 10:29:19.837 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:20.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:21.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:22 np0005593294 nova_compute[225705]: 2026-01-23 10:29:22.063 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:22.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:23.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:24 np0005593294 nova_compute[225705]: 2026-01-23 10:29:24.839 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:24.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:26.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:27 np0005593294 nova_compute[225705]: 2026-01-23 10:29:27.064 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:27.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:28.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:29.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:29 np0005593294 nova_compute[225705]: 2026-01-23 10:29:29.842 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:30.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:31.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:32 np0005593294 nova_compute[225705]: 2026-01-23 10:29:32.066 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:32.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:33.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:33 np0005593294 podman[238950]: 2026-01-23 10:29:33.662812567 +0000 UTC m=+0.063520429 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:29:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:34 np0005593294 nova_compute[225705]: 2026-01-23 10:29:34.843 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:35.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:35.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:37.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:37 np0005593294 nova_compute[225705]: 2026-01-23 10:29:37.068 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:37.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:39.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:29:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:29:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:39.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:39 np0005593294 nova_compute[225705]: 2026-01-23 10:29:39.845 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:41.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:41.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:42 np0005593294 nova_compute[225705]: 2026-01-23 10:29:42.072 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:43.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:43.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:44 np0005593294 nova_compute[225705]: 2026-01-23 10:29:44.847 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:45.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:45 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:45.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:47.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:47 np0005593294 nova_compute[225705]: 2026-01-23 10:29:47.073 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:47.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 05:29:48 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:29:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 05:29:48 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:29:48 np0005593294 podman[239108]: 2026-01-23 10:29:48.711092044 +0000 UTC m=+0.105853907 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:29:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:49.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:49.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:49 np0005593294 nova_compute[225705]: 2026-01-23 10:29:49.848 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:51.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:51.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:29:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2404.3 total, 600.0 interval#012Cumulative writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 13K writes, 4008 syncs, 3.45 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3081 writes, 10K keys, 3081 commit groups, 1.0 writes per commit group, ingest: 9.92 MB, 0.02 MB/s#012Interval WAL: 3081 writes, 1324 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:29:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:52 np0005593294 nova_compute[225705]: 2026-01-23 10:29:52.077 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:53.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:54 np0005593294 nova_compute[225705]: 2026-01-23 10:29:54.851 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:55.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:29:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:29:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:29:55.061 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:55.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:56 np0005593294 systemd-logind[807]: New session 55 of user zuul.
Jan 23 05:29:56 np0005593294 systemd[1]: Started Session 55 of User zuul.
Jan 23 05:29:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:29:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:29:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:29:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:29:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:57 np0005593294 nova_compute[225705]: 2026-01-23 10:29:57.078 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:57.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:59.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:29:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:59.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:59 np0005593294 nova_compute[225705]: 2026-01-23 10:29:59.854 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:00 np0005593294 ceph-mon[80126]: Health detail: HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 05:30:00 np0005593294 ceph-mon[80126]: [WRN] BLUESTORE_SLOW_OP_ALERT: 2 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:30:00 np0005593294 ceph-mon[80126]:     osd.1 observed slow operation indications in BlueStore
Jan 23 05:30:00 np0005593294 ceph-mon[80126]:     osd.2 observed slow operation indications in BlueStore
Jan 23 05:30:00 np0005593294 ceph-mon[80126]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Jan 23 05:30:00 np0005593294 ceph-mon[80126]:    daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0 is in error state
Jan 23 05:30:00 np0005593294 ceph-mon[80126]:    daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2 is in error state
Jan 23 05:30:00 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 05:30:00 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1970715676' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 05:30:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:01.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:01.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:02 np0005593294 nova_compute[225705]: 2026-01-23 10:30:02.080 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:03.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:03.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:04 np0005593294 podman[239477]: 2026-01-23 10:30:04.663228104 +0000 UTC m=+0.058600598 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:30:04 np0005593294 nova_compute[225705]: 2026-01-23 10:30:04.857 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:05.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:05.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:06 np0005593294 ovs-vsctl[239527]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 05:30:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:07.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:07 np0005593294 nova_compute[225705]: 2026-01-23 10:30:07.083 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:07 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 05:30:07 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 05:30:07 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 05:30:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:07.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:08 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: cache status {prefix=cache status} (starting...)
Jan 23 05:30:08 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:08 np0005593294 lvm[239886]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 05:30:08 np0005593294 lvm[239886]: VG ceph_vg0 finished
Jan 23 05:30:08 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: client ls {prefix=client ls} (starting...)
Jan 23 05:30:08 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:08 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 05:30:08 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 05:30:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3709761161' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 05:30:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:09.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 05:30:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2096205967' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:09.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 05:30:09 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 05:30:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2048205722' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 05:30:09 np0005593294 nova_compute[225705]: 2026-01-23 10:30:09.859 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:10 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 05:30:10 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:10 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 05:30:10 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 05:30:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2632100089' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 05:30:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 05:30:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/604163341' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 05:30:10 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: ops {prefix=ops} (starting...)
Jan 23 05:30:10 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:10 np0005593294 nova_compute[225705]: 2026-01-23 10:30:10.890 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:11.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/447049587' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:30:11 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: session ls {prefix=session ls} (starting...)
Jan 23 05:30:11 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:30:11 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: status {prefix=status} (starting...)
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/661403047' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 05:30:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:11.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3873136511' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:30:11 np0005593294 nova_compute[225705]: 2026-01-23 10:30:11.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:11 np0005593294 nova_compute[225705]: 2026-01-23 10:30:11.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:30:11 np0005593294 nova_compute[225705]: 2026-01-23 10:30:11.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 05:30:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2326913399' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 05:30:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:12 np0005593294 nova_compute[225705]: 2026-01-23 10:30:12.085 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1159080727' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/728729314' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3971325804' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 05:30:12 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/591404194' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 05:30:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:13.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1309114209' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2372642057' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 05:30:13 np0005593294 nova_compute[225705]: 2026-01-23 10:30:13.572 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:30:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:13.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2049800932' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:30:13 np0005593294 nova_compute[225705]: 2026-01-23 10:30:13.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:30:13 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/600541726' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:30:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:30:14 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2345853878' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.861 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81485824 unmapped: 1310720 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 978492 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980004 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.178896904s of 10.185409546s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979413 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a55f919e00
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61c00 session 0x55a563070d20
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81485824 unmapped: 1310720 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 38.248489380s of 38.295757294s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979413 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979413 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980925 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.098414421s of 12.469996452s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980334 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81600512 unmapped: 1196032 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81600512 unmapped: 1196032 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a562ed3860
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a563070000
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81649664 unmapped: 1146880 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81649664 unmapped: 1146880 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81657856 unmapped: 1138688 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81657856 unmapped: 1138688 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81666048 unmapped: 1130496 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.839187622s of 28.427438736s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81666048 unmapped: 1130496 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81666048 unmapped: 1130496 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81674240 unmapped: 1122304 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81690624 unmapped: 1105920 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981846 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81690624 unmapped: 1105920 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81690624 unmapped: 1105920 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a5602192c0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d65000 session 0x55a562e5ef00
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 604.3 total, 600.0 interval
Cumulative writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
Cumulative WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 21.31 MB, 0.04 MB/s
Interval WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 604.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 604.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 604.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81764352 unmapped: 1032192 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81772544 unmapped: 1024000 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981846 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81772544 unmapped: 1024000 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81772544 unmapped: 1024000 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.093015671s of 12.653417587s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81797120 unmapped: 999424 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981123 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81797120 unmapped: 999424 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981255 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81854464 unmapped: 942080 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81854464 unmapped: 942080 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981255 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81911808 unmapped: 884736 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81911808 unmapped: 884736 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981255 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.302114487s of 17.311687469s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81936384 unmapped: 860160 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81936384 unmapped: 860160 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81944576 unmapped: 851968 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81944576 unmapped: 851968 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 843776 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 843776 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 843776 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81960960 unmapped: 835584 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81960960 unmapped: 835584 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81969152 unmapped: 827392 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81969152 unmapped: 827392 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81969152 unmapped: 827392 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81985536 unmapped: 811008 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81985536 unmapped: 811008 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82018304 unmapped: 778240 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82018304 unmapped: 778240 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread fragmentation_score=0.000033 took=0.000313s
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 761856 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 761856 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 761856 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 753664 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 753664 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 745472 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 745472 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 737280 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 737280 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 737280 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 729088 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 729088 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 696320 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 696320 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 696320 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 688128 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 688128 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 679936 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 679936 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 671744 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 671744 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 671744 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 663552 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 663552 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 655360 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a56017a3c0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a56327a3c0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 655360 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 655360 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 647168 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 647168 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 638976 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 638976 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 638976 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 630784 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 630784 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 622592 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 622592 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 75.745262146s of 75.806076050s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 622592 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 614400 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 614400 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 606208 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 982176 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 606208 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 598016 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 598016 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 589824 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 589824 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983688 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 589824 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 581632 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 581632 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 573440 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 573440 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983688 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82231296 unmapped: 565248 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82231296 unmapped: 565248 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 557056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.955039978s of 17.089204788s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 557056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 557056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82247680 unmapped: 548864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 540672 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a56327a000
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a563071860
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.438745499s of 20.442741394s, submitted: 1
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983688 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985200 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984609 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.390316963s of 12.403103828s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983886 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983886 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a56103d680
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a56327b860
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.533004761s of 13.541749001s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983973 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 1638400 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82247680 unmapped: 1597440 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 1474560 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983886 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 7.916155815s of 10.561954498s, submitted: 354
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984018 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984018 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983427 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983427 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.676980972s of 15.957923889s, submitted: 16
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d66800 session 0x55a562ed21e0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d56000 session 0x55a560eef4a0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a562e565a0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a56327be00
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 59.252044678s of 59.258327484s, submitted: 1
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983427 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983559 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983559 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.363295555s of 12.371066093s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985071 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a5601c9c20
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 71.638946533s of 71.669204712s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984939 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986451 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985860 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.976454735s of 15.996831894s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 1277952 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 1277952 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a5601ca960
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a55f919a40
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.710205078s of 28.833591461s, submitted: 1
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985860 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987372 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.073184013s of 12.080324173s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986781 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a562ed32c0
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a56327a000
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a560ef7c20
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.307456970s of 35.356937408s, submitted: 2
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986913 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 988425 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.996991158s of 14.014451981s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987702 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.980 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.980 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.980 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.981 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:30:14 np0005593294 nova_compute[225705]: 2026-01-23 10:30:14.981 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a560e96780
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 52.138698578s of 52.151603699s, submitted: 3
Jan 23 05:30:14 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 989214 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990726 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.526103973s of 13.826724052s, submitted: 4
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a5637525a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a5637523c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.582626343s of 49.590423584s, submitted: 1
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990135 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991647 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 992568 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.072285652s of 12.089330673s, submitted: 4
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a560b463c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1204.3 total, 600.0 interval
Cumulative writes: 9060 writes, 35K keys, 9060 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 9060 writes, 1959 syncs, 4.62 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 781 writes, 1248 keys, 781 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
Interval WAL: 781 writes, 366 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1204.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1204.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1204.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022d400 session 0x55a560ef6780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d65000 session 0x55a5601c9c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 95.210830688s of 95.219345093s, submitted: 2
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995001 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a560c23a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a560b46780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d56000 session 0x55a56370c000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a560223860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 99.385841370s of 100.670951843s, submitted: 7
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83787776 unmapped: 57344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1,0,1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83861504 unmapped: 1032192 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83869696 unmapped: 1024000 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995535 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,3])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84934656 unmapped: 1007616 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84942848 unmapped: 999424 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995463 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 3.510344267s of 10.329211235s, submitted: 399
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994281 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a5602190e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a562f18960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.042835236s of 40.052230835s, submitted: 3
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994149 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.938447952s of 13.945899963s, submitted: 2
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4293863724' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 120.149322510s of 120.153068542s, submitted: 1
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 999295 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 1671168 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56022d800 session 0x55a560b46000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fc5e9000/0x0/0x4ffc00000, data 0x161cf8/0x221000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56049b000 session 0x55a563595a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1144302 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb179000/0x0/0x4ffc00000, data 0x15d1cf8/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a563594000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a55fd2bc00 session 0x55a562f1fc20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022d400 session 0x55a56387fa40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.783803940s of 31.204965591s, submitted: 49
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150286 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d65000 session 0x55a560e963c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64c00 session 0x55a5627854a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150194 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb175000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a560a07a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a560b472c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64000 session 0x55a560e912c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.051367760s of 11.062813759s, submitted: 3
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d67400 session 0x55a560e96f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022c400 session 0x55a563594b40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85614592 unmapped: 18161664 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a5635941e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85639168 unmapped: 18137088 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a560b61400 session 0x55a562e5e960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387e5a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d67400 session 0x55a5637521e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561cd6000 session 0x55a560219e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187745 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a56049b000 session 0x55a560e914a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87138304 unmapped: 16637952 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387fc20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1186624 data_alloc: 218103808 data_used: 303104
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87146496 unmapped: 16629760 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89415680 unmapped: 14360576 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89473024 unmapped: 14303232 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.645101547s of 10.825790405s, submitted: 53
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.822414398s of 10.054231644s, submitted: 18
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa929000/0x0/0x4ffc00000, data 0x1e13078/0x1edb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a56327b0e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a563595e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a562f192c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92602368 unmapped: 11173888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.268079758s of 28.147586823s, submitted: 54
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a56021e780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef6d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92594176 unmapped: 11182080 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a562785680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a55f9194a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a562e5fe00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e5f2c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a5602214a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a560222f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93298688 unmapped: 20979712 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.620344162s of 17.155050278s, submitted: 29
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a560b99c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 94502912 unmapped: 19775488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100663296 unmapped: 13615104 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 9502720 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.021077156s of 12.051360130s, submitted: 8
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439531 data_alloc: 234881024 data_used: 14000128
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 9658368 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112254976 unmapped: 8896512 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4e000/0x0/0x4ffc00000, data 0x3655088/0x371e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1517819 data_alloc: 234881024 data_used: 14286848
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112541696 unmapped: 8609792 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.255970955s of 10.148886681s, submitted: 105
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc1800 session 0x55a562e563c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a56370c780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1513215 data_alloc: 234881024 data_used: 14286848
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103571456 unmapped: 17580032 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563595680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.262123108s of 10.403896332s, submitted: 50
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560c245a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5602192c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103260160 unmapped: 17891328 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a56103c5a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560ef61e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562f1f680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.069961548s of 14.267497063s, submitted: 33
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:15.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee6f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5601c9c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a5601c9680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.291867256s of 10.369996071s, submitted: 3
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a56327b860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184631 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560a9fa40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee7a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100130816 unmapped: 21020672 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560219a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a5630701e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e56780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a5630d8f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560ef6780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630dad20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aef000/0x0/0x4ffc00000, data 0x1ab6ff3/0x1b7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54000 session 0x55a56103d4a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1226346 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560eef2c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560eefe00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99868672 unmapped: 29679616 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99893248 unmapped: 29655040 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.546638489s of 18.645618439s, submitted: 27
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 26910720 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e561e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b990e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300302 data_alloc: 218103808 data_used: 6565888
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9635000/0x0/0x4ffc00000, data 0x1f6a003/0x2031000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,10])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106921984 unmapped: 22626304 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310702 data_alloc: 218103808 data_used: 6471680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.670749664s of 11.247441292s, submitted: 64
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.062977791s of 12.069572449s, submitted: 1
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302567 data_alloc: 218103808 data_used: 6483968
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560eef0e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5603c8b40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 27033600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a562f1e5a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630703c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a563070f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5630705a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.487091064s of 36.540157318s, submitted: 34
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560ef61e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef63c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560ef7e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55800 session 0x55a560ef6960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560ef6000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275342 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276816 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c57c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101695488 unmapped: 31531008 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102670336 unmapped: 30556160 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1804.3 total, 600.0 interval#012Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 10K writes, 2684 syncs, 4.00 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1689 writes, 4700 keys, 1689 commit groups, 1.0 writes per commit group, ingest: 4.62 MB, 0.01 MB/s#012Interval WAL: 1689 writes, 725 syncs, 2.33 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350291 data_alloc: 234881024 data_used: 11251712
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.624202728s of 23.833007812s, submitted: 21
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111206400 unmapped: 22020096 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412875 data_alloc: 234881024 data_used: 11755520
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8cba000/0x0/0x4ffc00000, data 0x28ecfe3/0x29b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 21528576 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113246208 unmapped: 19980288 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433231 data_alloc: 234881024 data_used: 12496896
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428407 data_alloc: 234881024 data_used: 12496896
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.531507492s of 14.888542175s, submitted: 102
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428511 data_alloc: 234881024 data_used: 12496896
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120709120 unmapped: 12517376 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a560e90d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c22000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd9000 session 0x55a560b47860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1484761 data_alloc: 234881024 data_used: 12496896
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560c24000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630710e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560b463c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a562785a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214800 session 0x55a55fee70e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.321987152s of 11.643644333s, submitted: 14
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a563594960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f84b7000/0x0/0x4ffc00000, data 0x30effe3/0x31b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503794 data_alloc: 234881024 data_used: 14667776
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532370 data_alloc: 234881024 data_used: 18919424
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532658 data_alloc: 234881024 data_used: 18923520
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.284761429s of 12.309606552s, submitted: 7
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121954304 unmapped: 11272192 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649774 data_alloc: 234881024 data_used: 19283968
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1644310 data_alloc: 234881024 data_used: 19283968
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55fee7c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a56103c000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a5603c9a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d000 session 0x55a5601c8000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.608616829s of 10.889985085s, submitted: 103
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601cbe00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87ee000/0x0/0x4ffc00000, data 0x29a7fe3/0x2a6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442224 data_alloc: 234881024 data_used: 12496896
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563070960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee6f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a561048d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217850 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.357207298s of 10.545221329s, submitted: 64
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1219494 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221006 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.268618584s of 12.279949188s, submitted: 3
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.385444641s of 16.389841080s, submitted: 1
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5623ae400 session 0x55a560ef7a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5635954a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630714a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630705a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630703c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286318 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107503616 unmapped: 33071104 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560eeed20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 33587200 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1288671 data_alloc: 218103808 data_used: 393216
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 32366592 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a560eef4a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.396499634s of 20.359004974s, submitted: 48
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111820800 unmapped: 28753920 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115318784 unmapped: 25255936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c5f000/0x0/0x4ffc00000, data 0x2537045/0x25fd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 25124864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c54000/0x0/0x4ffc00000, data 0x2541045/0x2607000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409157 data_alloc: 218103808 data_used: 9031680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bd3000/0x0/0x4ffc00000, data 0x25c3045/0x2689000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113139712 unmapped: 27435008 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 27426816 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408089 data_alloc: 218103808 data_used: 9035776
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870034218s of 12.169149399s, submitted: 80
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411361 data_alloc: 218103808 data_used: 9035776
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411659 data_alloc: 218103808 data_used: 9043968
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 27279360 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412595 data_alloc: 218103808 data_used: 9043968
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.734275818s of 13.774451256s, submitted: 9
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee7c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a55fee6f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a561048d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d62800 session 0x55a562c57c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114442240 unmapped: 26132480 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562c574a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a070e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a563071a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a5603c9680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55c00 session 0x55a560ef7c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 25993216 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec06e/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432887 data_alloc: 218103808 data_used: 9048064
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec0a7/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113704960 unmapped: 26869760 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a5603c9c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 26853376 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434949 data_alloc: 218103808 data_used: 9048064
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d66000 session 0x55a560223a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x26ed0ca/0x27b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.734139442s of 11.903190613s, submitted: 57
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114343936 unmapped: 26230784 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114401280 unmapped: 26173440 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440168 data_alloc: 234881024 data_used: 9789440
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [0,0,1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114794496 unmapped: 25780224 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440509 data_alloc: 234881024 data_used: 9789440
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116137984 unmapped: 24436736 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f840e000/0x0/0x4ffc00000, data 0x2d700ca/0x2e38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1504109 data_alloc: 234881024 data_used: 9850880
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560c25680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef65a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117350400 unmapped: 23224320 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.155679703s of 12.892781258s, submitted: 488
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496421 data_alloc: 234881024 data_used: 9854976
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f83f3000/0x0/0x4ffc00000, data 0x2da10ca/0x2e69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496237 data_alloc: 234881024 data_used: 9854976
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.086823463s of 10.129245758s, submitted: 9
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a5601c8780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55f9181e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116473856 unmapped: 24100864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c22f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.182063103s of 12.550888062s, submitted: 72
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1424015 data_alloc: 218103808 data_used: 9109504
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601ca780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560c24f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a563071680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2bc00 session 0x55a5627841e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a562c56000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.355859756s of 30.541212082s, submitted: 59
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245655 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e57680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56327a5a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a07a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a5603c94a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560a072c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562785e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a560ef6000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56370cb40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a560219860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339283 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112795648 unmapped: 31449088 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560e914a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 31440896 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 31277056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [1])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1418779 data_alloc: 234881024 data_used: 12140544
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560b465a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5635954a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.497446060s of 13.612625122s, submitted: 20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56021fe00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.943146706s of 13.990984917s, submitted: 17
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022dc00 session 0x55a56387e780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c570e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5602230e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a563752d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a562f19e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9819000/0x0/0x4ffc00000, data 0x197d045/0x1a43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1289281 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89000 session 0x55a560219a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562e56f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a562f1f680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563752000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c24000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5603c9a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 32890880 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298244 data_alloc: 218103808 data_used: 1359872
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.371298790s of 11.489388466s, submitted: 37
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56021fa40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110919680 unmapped: 33325056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56017a960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 33308672 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: mgrc ms_handle_reset ms_handle_reset con 0x55a561cb3c00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: mgrc handle_mgr_configure stats_period=5
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d68c00 session 0x55a562e5f680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56016f400 session 0x55a562ed23c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a55f6d90e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.596210480s of 16.945894241s, submitted: 37
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260315 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.821660995s of 13.956790924s, submitted: 3
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58400 session 0x55a5601ca3c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b46780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55f919860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562c563c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56370cd20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89800 session 0x55a5630dad20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303917 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96ee000/0x0/0x4ffc00000, data 0x1aa8045/0x1b6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 32940032 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560a9fa40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5601ca3c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.447840691s of 13.576416016s, submitted: 37
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111329280 unmapped: 32915456 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5601caf00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307891 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110780416 unmapped: 33464320 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111091712 unmapped: 33153024 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111099904 unmapped: 33144832 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.020989418s of 15.036432266s, submitted: 3
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112001024 unmapped: 32243712 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115662848 unmapped: 28581888 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.164536476s of 14.331671715s, submitted: 56
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a5601cba40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc2000 session 0x55a5601c81e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560219e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a56021f860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a562ed21e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.203777313s of 22.330352783s, submitted: 42
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213c00 session 0x55a5601ca960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215800 session 0x55a562ed2b40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee70e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560220000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a562e5f2c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9689000/0x0/0x4ffc00000, data 0x1b0dfe3/0x1bd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343500 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a562784780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111263744 unmapped: 32980992 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343804 data_alloc: 218103808 data_used: 339968
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111198208 unmapped: 33046528 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399892 data_alloc: 218103808 data_used: 8769536
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.868194580s of 11.951797485s, submitted: 14
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399760 data_alloc: 218103808 data_used: 8769536
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 28647424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 22994944 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 22970368 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d6bc00 session 0x55a563071680
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a563070960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c881e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560c883c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 119250944 unmapped: 24993792 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a5601ca780
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56800 session 0x55a560ef6960
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a561048d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5610485a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a5610492c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586678 data_alloc: 234881024 data_used: 9949184
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 31072256 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a55fee6f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.895648956s of 14.161753654s, submitted: 94
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56257a400 session 0x55a55fee7a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586694 data_alloc: 234881024 data_used: 9949184
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a55fee6d20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee7c20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 27189248 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.947762489s of 13.955580711s, submitted: 2
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134848512 unmapped: 17276928 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f777a000/0x0/0x4ffc00000, data 0x360bff3/0x36d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6b8d000/0x0/0x4ffc00000, data 0x41f0ff3/0x42b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759978 data_alloc: 234881024 data_used: 22151168
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134922240 unmapped: 17203200 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6af2000/0x0/0x4ffc00000, data 0x4293ff3/0x435a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771750 data_alloc: 234881024 data_used: 22212608
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135184384 unmapped: 16941056 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770406 data_alloc: 234881024 data_used: 22220800
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.216358185s of 14.454858780s, submitted: 91
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770158 data_alloc: 234881024 data_used: 22220800
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5603c92c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a560ef7a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1512596 data_alloc: 234881024 data_used: 9957376
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8392000/0x0/0x4ffc00000, data 0x29f3ff3/0x2aba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55f6c5400 session 0x55a561048b40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560ef72c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563610800 session 0x55a560c24f00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.984561920s of 24.057754517s, submitted: 24
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560219860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.494945526s of 40.619098663s, submitted: 20
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120668160 unmapped: 31457280 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 31842304 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560ef7860
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a562c565a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560218000
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1358694 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560219e00
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560ef65a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357294 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.344947815s of 10.433979988s, submitted: 23
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a563071a40
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359231 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120242176 unmapped: 31883264 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.895034790s of 14.912478447s, submitted: 5
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123174912 unmapped: 28950528 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1462949 data_alloc: 218103808 data_used: 7610368
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f895c000/0x0/0x4ffc00000, data 0x242a006/0x24f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467605 data_alloc: 218103808 data_used: 7610368
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8863000/0x0/0x4ffc00000, data 0x2523006/0x25e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.162461281s of 11.450368881s, submitted: 61
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126320640 unmapped: 25804800 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1473525 data_alloc: 218103808 data_used: 7856128
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1476015 data_alloc: 218103808 data_used: 7860224
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87c4000/0x0/0x4ffc00000, data 0x25c2006/0x2688000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.021368027s of 10.088058472s, submitted: 14
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202c00 session 0x55a5602214a0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c53400 session 0x55a560a9f0e0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308597 data_alloc: 218103808 data_used: 311296
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215c00 session 0x55a5630703c0
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2404.3 total, 600.0 interval
Cumulative writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
Cumulative WAL: 13K writes, 4008 syncs, 3.45 writes per sync, written: 0.04 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 3081 writes, 10K keys, 3081 commit groups, 1.0 writes per commit group, ingest: 9.92 MB, 0.02 MB/s
Interval WAL: 3081 writes, 1324 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122044416 unmapped: 30081024 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}'
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'config show' '{prefix=config show}'
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122052608 unmapped: 30072832 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 30474240 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 30359552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:30:15 np0005593294 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}'
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2952901814' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4201206189' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1559929094' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.562 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:15.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.761 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.762 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4764MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.762 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.763 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.870 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.870 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:30:15 np0005593294 nova_compute[225705]: 2026-01-23 10:30:15.889 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:30:15 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3107004277' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:30:16 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:30:16 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4004933432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:16 np0005593294 nova_compute[225705]: 2026-01-23 10:30:16.402 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:16 np0005593294 nova_compute[225705]: 2026-01-23 10:30:16.408 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:30:16 np0005593294 nova_compute[225705]: 2026-01-23 10:30:16.425 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:30:16 np0005593294 nova_compute[225705]: 2026-01-23 10:30:16.427 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:30:16 np0005593294 nova_compute[225705]: 2026-01-23 10:30:16.427 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:16 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 05:30:16 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1506185174' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 05:30:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:17.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:17 np0005593294 nova_compute[225705]: 2026-01-23 10:30:17.087 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 05:30:17 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/873021322' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 05:30:17 np0005593294 nova_compute[225705]: 2026-01-23 10:30:17.428 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:17 np0005593294 nova_compute[225705]: 2026-01-23 10:30:17.428 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:17 np0005593294 nova_compute[225705]: 2026-01-23 10:30:17.429 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:17 np0005593294 nova_compute[225705]: 2026-01-23 10:30:17.429 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:17 np0005593294 nova_compute[225705]: 2026-01-23 10:30:17.429 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:30:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:17.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 05:30:17 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/753443402' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2692355636' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2848004950' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:18 np0005593294 nova_compute[225705]: 2026-01-23 10:30:18.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 05:30:18 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3799754458' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 05:30:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 05:30:19 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1425965098' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 05:30:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:19.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 05:30:19 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/641995280' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 05:30:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:19.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:19 np0005593294 podman[241574]: 2026-01-23 10:30:19.709936464 +0000 UTC m=+0.107902250 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 23 05:30:19 np0005593294 systemd[1]: Starting Hostname Service...
Jan 23 05:30:19 np0005593294 nova_compute[225705]: 2026-01-23 10:30:19.863 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 05:30:19 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1463104035' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 05:30:19 np0005593294 systemd[1]: Started Hostname Service.
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/331196613' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1013832290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/758305960' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 23 05:30:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/15662595' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 05:30:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:21.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 05:30:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2098994895' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 05:30:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:21.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:22 np0005593294 nova_compute[225705]: 2026-01-23 10:30:22.088 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:22 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 05:30:22 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1090320711' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 05:30:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:23.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 05:30:23 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3613864487' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 05:30:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:23.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:30:23 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1692553818' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:30:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 05:30:24 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3854792418' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 05:30:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 05:30:24 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3166484618' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 05:30:24 np0005593294 nova_compute[225705]: 2026-01-23 10:30:24.866 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:25.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:25 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 05:30:25 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1855905813' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 05:30:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:25.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:26 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:30:26 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:30:26 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:30:26 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:30:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:27 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:30:27 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:30:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:27 np0005593294 nova_compute[225705]: 2026-01-23 10:30:27.091 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:27 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:30:27 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:30:27 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 23 05:30:27 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/931928461' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 05:30:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:27.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/292578968' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3039266935' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 23 05:30:28 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/584663929' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 05:30:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:29.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 23 05:30:29 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2920006099' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 05:30:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:29.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 23 05:30:29 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1049295026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 05:30:29 np0005593294 nova_compute[225705]: 2026-01-23 10:30:29.868 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:30 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 23 05:30:30 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1786797257' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 05:30:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:31.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:31.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:32 np0005593294 nova_compute[225705]: 2026-01-23 10:30:32.093 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:32 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 23 05:30:32 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1709192928' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 05:30:33 np0005593294 ovs-appctl[243650]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 05:30:33 np0005593294 ovs-appctl[243655]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 05:30:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:33.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:33 np0005593294 ovs-appctl[243663]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 05:30:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Jan 23 05:30:33 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/368403482' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 05:30:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Jan 23 05:30:34 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2586594643' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 05:30:34 np0005593294 nova_compute[225705]: 2026-01-23 10:30:34.907 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:35 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 23 05:30:35 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2023447290' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 05:30:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:35.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:35 np0005593294 podman[244677]: 2026-01-23 10:30:35.667012025 +0000 UTC m=+0.068533442 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/514541159' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Jan 23 05:30:36 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2493126188' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 05:30:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:37.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:37 np0005593294 nova_compute[225705]: 2026-01-23 10:30:37.150 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:37.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:37 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 05:30:37 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1176070681' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 05:30:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 23 05:30:38 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/571241287' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 05:30:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 23 05:30:38 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2918126809' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:39.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:30:39 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1796333743' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:30:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:39.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:39 np0005593294 nova_compute[225705]: 2026-01-23 10:30:39.910 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Jan 23 05:30:40 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1655420808' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 05:30:40 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Jan 23 05:30:40 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2704307379' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:41 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 23 05:30:41 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2288402487' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 05:30:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:41.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:41.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:41 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Jan 23 05:30:41 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/560283955' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 05:30:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:42 np0005593294 nova_compute[225705]: 2026-01-23 10:30:42.151 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:42 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 23 05:30:42 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3879520383' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:43.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 23 05:30:43 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3587918853' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 05:30:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:44 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 05:30:44 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 23 05:30:44 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1756811035' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:44 np0005593294 nova_compute[225705]: 2026-01-23 10:30:44.959 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:45 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 23 05:30:45 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/634711209' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 05:30:45 np0005593294 systemd[1]: Starting Time & Date Service...
Jan 23 05:30:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:45.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:45 np0005593294 systemd[1]: Started Time & Date Service.
Jan 23 05:30:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:45.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:46 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:46 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:47 np0005593294 nova_compute[225705]: 2026-01-23 10:30:47.153 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1725430918' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6530 writes, 35K keys, 6530 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6530 writes, 6530 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1542 writes, 7888 keys, 1542 commit groups, 1.0 writes per commit group, ingest: 17.81 MB, 0.03 MB/s#012Interval WAL: 1542 writes, 1542 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     61.0      0.80              0.15        18    0.045       0      0       0.0       0.0#012  L6      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4    118.4    101.7      2.09              0.60        17    0.123     94K   9317       0.0       0.0#012 Sum      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.3      0.1       0.0   5.4     85.6     90.4      2.90              0.75        35    0.083     94K   9317       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.6     96.0     98.7      0.64              0.18         8    0.081     26K   2540       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    118.4    101.7      2.09              0.60        17    0.123     94K   9317       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     61.1      0.80              0.15        17    0.047       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.048, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.26 GB write, 0.11 MB/s write, 0.24 GB read, 0.10 MB/s read, 2.9 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 22.75 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000178 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1398,22.02 MB,7.24434%) FilterBlock(35,274.42 KB,0.0881546%) IndexBlock(35,473.86 KB,0.152221%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3995885793' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:47.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1993038481' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:30:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 23 05:30:48 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3762032723' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 05:30:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:49.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:49 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 23 05:30:49 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/736282130' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 05:30:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:49.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:49 np0005593294 nova_compute[225705]: 2026-01-23 10:30:49.962 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:50 np0005593294 podman[246382]: 2026-01-23 10:30:50.712785068 +0000 UTC m=+0.109211701 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:30:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:51.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:51.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:52 np0005593294 nova_compute[225705]: 2026-01-23 10:30:52.158 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:53.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:53.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:54 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:54 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:54 np0005593294 nova_compute[225705]: 2026-01-23 10:30:54.965 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:30:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:30:55.061 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:30:55.062 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:55.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:30:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:30:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:30:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:30:57 np0005593294 nova_compute[225705]: 2026-01-23 10:30:57.161 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:57.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:57.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:59.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:30:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:00 np0005593294 nova_compute[225705]: 2026-01-23 10:31:00.023 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:01.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:02 np0005593294 nova_compute[225705]: 2026-01-23 10:31:02.163 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:03.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:05 np0005593294 nova_compute[225705]: 2026-01-23 10:31:05.027 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:05.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:05.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:06 np0005593294 podman[246446]: 2026-01-23 10:31:06.663504745 +0000 UTC m=+0.066625095 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:31:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:07 np0005593294 nova_compute[225705]: 2026-01-23 10:31:07.162 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:07.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:07.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:09.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:09.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:10 np0005593294 nova_compute[225705]: 2026-01-23 10:31:10.031 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:11.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:11.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:11 np0005593294 nova_compute[225705]: 2026-01-23 10:31:11.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:11 np0005593294 nova_compute[225705]: 2026-01-23 10:31:11.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:31:11 np0005593294 nova_compute[225705]: 2026-01-23 10:31:11.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:31:11 np0005593294 nova_compute[225705]: 2026-01-23 10:31:11.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:31:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:12 np0005593294 nova_compute[225705]: 2026-01-23 10:31:12.164 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:12 np0005593294 nova_compute[225705]: 2026-01-23 10:31:12.886 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:13.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:13.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:13 np0005593294 nova_compute[225705]: 2026-01-23 10:31:13.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:14 np0005593294 nova_compute[225705]: 2026-01-23 10:31:14.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:14 np0005593294 nova_compute[225705]: 2026-01-23 10:31:14.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:31:14 np0005593294 nova_compute[225705]: 2026-01-23 10:31:14.913 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:31:14 np0005593294 nova_compute[225705]: 2026-01-23 10:31:14.914 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:14 np0005593294 nova_compute[225705]: 2026-01-23 10:31:14.915 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.071 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:15.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:15 np0005593294 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 05:31:15 np0005593294 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 05:31:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.930 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.957 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.958 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.958 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.958 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:31:15 np0005593294 nova_compute[225705]: 2026-01-23 10:31:15.959 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:16 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:31:16 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4255286849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.419 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.611 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.612 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4651MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.612 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.613 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.692 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.692 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:31:16 np0005593294 nova_compute[225705]: 2026-01-23 10:31:16.819 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.167 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:17.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:31:17 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2961280680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.307 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.315 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.337 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.342 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.343 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:17 np0005593294 nova_compute[225705]: 2026-01-23 10:31:17.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:18 np0005593294 nova_compute[225705]: 2026-01-23 10:31:18.882 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:18 np0005593294 nova_compute[225705]: 2026-01-23 10:31:18.907 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:18 np0005593294 nova_compute[225705]: 2026-01-23 10:31:18.908 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:31:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:19.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:20 np0005593294 nova_compute[225705]: 2026-01-23 10:31:20.074 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:20 np0005593294 podman[246545]: 2026-01-23 10:31:20.864859389 +0000 UTC m=+0.099222794 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:31:20 np0005593294 nova_compute[225705]: 2026-01-23 10:31:20.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:21.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:21.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:21 np0005593294 nova_compute[225705]: 2026-01-23 10:31:21.775 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:22 np0005593294 nova_compute[225705]: 2026-01-23 10:31:22.172 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:23.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:23.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:25 np0005593294 nova_compute[225705]: 2026-01-23 10:31:25.077 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:25.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:25.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:27 np0005593294 nova_compute[225705]: 2026-01-23 10:31:27.173 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:27.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:27.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:29.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:30 np0005593294 nova_compute[225705]: 2026-01-23 10:31:30.081 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:31.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:31.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:32 np0005593294 nova_compute[225705]: 2026-01-23 10:31:32.175 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:33.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:33.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:35 np0005593294 nova_compute[225705]: 2026-01-23 10:31:35.111 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:35.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:35.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:37 np0005593294 nova_compute[225705]: 2026-01-23 10:31:37.176 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:37.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:37 np0005593294 podman[246605]: 2026-01-23 10:31:37.68659216 +0000 UTC m=+0.085809713 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:31:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:37.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:39.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:39.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:39 np0005593294 systemd[1]: session-55.scope: Deactivated successfully.
Jan 23 05:31:39 np0005593294 systemd[1]: session-55.scope: Consumed 2min 58.763s CPU time, 719.0M memory peak, read 257.5M from disk, written 64.5M to disk.
Jan 23 05:31:39 np0005593294 systemd-logind[807]: Session 55 logged out. Waiting for processes to exit.
Jan 23 05:31:39 np0005593294 systemd-logind[807]: Removed session 55.
Jan 23 05:31:40 np0005593294 systemd-logind[807]: New session 56 of user zuul.
Jan 23 05:31:40 np0005593294 systemd[1]: Started Session 56 of User zuul.
Jan 23 05:31:40 np0005593294 nova_compute[225705]: 2026-01-23 10:31:40.114 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:40 np0005593294 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 05:31:40 np0005593294 systemd-logind[807]: Session 56 logged out. Waiting for processes to exit.
Jan 23 05:31:40 np0005593294 systemd-logind[807]: Removed session 56.
Jan 23 05:31:40 np0005593294 systemd-logind[807]: New session 57 of user zuul.
Jan 23 05:31:40 np0005593294 systemd[1]: Started Session 57 of User zuul.
Jan 23 05:31:40 np0005593294 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 05:31:40 np0005593294 systemd-logind[807]: Session 57 logged out. Waiting for processes to exit.
Jan 23 05:31:40 np0005593294 systemd-logind[807]: Removed session 57.
Jan 23 05:31:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:41.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:41.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:42 np0005593294 nova_compute[225705]: 2026-01-23 10:31:42.178 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:43.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:43.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:45 np0005593294 nova_compute[225705]: 2026-01-23 10:31:45.118 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:45.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:45.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:47 np0005593294 nova_compute[225705]: 2026-01-23 10:31:47.178 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:47.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:47.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:49.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:49.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:50 np0005593294 nova_compute[225705]: 2026-01-23 10:31:50.122 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:51.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:51 np0005593294 podman[246715]: 2026-01-23 10:31:51.741918802 +0000 UTC m=+0.135660127 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:31:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:52 np0005593294 nova_compute[225705]: 2026-01-23 10:31:52.180 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:53.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:54 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:54 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:31:55.063 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:31:55.063 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:31:55.064 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:55 np0005593294 nova_compute[225705]: 2026-01-23 10:31:55.125 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:55 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:31:55 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:55 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:55 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:31:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:55.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:55.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:31:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:31:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:31:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:31:57 np0005593294 nova_compute[225705]: 2026-01-23 10:31:57.181 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:57.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:57.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:59.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:31:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:59.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:00 np0005593294 nova_compute[225705]: 2026-01-23 10:32:00.129 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:00 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:32:00 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:32:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:01.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:01.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:02 np0005593294 nova_compute[225705]: 2026-01-23 10:32:02.185 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.341299) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322341426, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2851, "num_deletes": 506, "total_data_size": 6398009, "memory_usage": 6490344, "flush_reason": "Manual Compaction"}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322360132, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2686587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33518, "largest_seqno": 36364, "table_properties": {"data_size": 2676761, "index_size": 5040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3717, "raw_key_size": 31915, "raw_average_key_size": 21, "raw_value_size": 2652031, "raw_average_value_size": 1807, "num_data_blocks": 215, "num_entries": 1467, "num_filter_entries": 1467, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164133, "oldest_key_time": 1769164133, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 18892 microseconds, and 7774 cpu microseconds.
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.360207) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2686587 bytes OK
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.360268) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.363096) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.363130) EVENT_LOG_v1 {"time_micros": 1769164322363126, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.363150) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6383594, prev total WAL file size 6383594, number of live WAL files 2.
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.365049) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2623KB)], [63(13MB)]
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322365244, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 17294163, "oldest_snapshot_seqno": -1}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6859 keys, 14416841 bytes, temperature: kUnknown
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322499623, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14416841, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14372547, "index_size": 26070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 177005, "raw_average_key_size": 25, "raw_value_size": 14250687, "raw_average_value_size": 2077, "num_data_blocks": 1044, "num_entries": 6859, "num_filter_entries": 6859, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.500211) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14416841 bytes
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.503100) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 128.6 rd, 107.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.9 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 7803, records dropped: 944 output_compression: NoCompression
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.503132) EVENT_LOG_v1 {"time_micros": 1769164322503118, "job": 38, "event": "compaction_finished", "compaction_time_micros": 134514, "compaction_time_cpu_micros": 59576, "output_level": 6, "num_output_files": 1, "total_output_size": 14416841, "num_input_records": 7803, "num_output_records": 6859, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322504332, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322509479, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.364801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:03.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:03.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:05 np0005593294 nova_compute[225705]: 2026-01-23 10:32:05.173 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:05.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:05.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:07 np0005593294 nova_compute[225705]: 2026-01-23 10:32:07.187 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:07.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:07.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:08 np0005593294 podman[246881]: 2026-01-23 10:32:08.081375111 +0000 UTC m=+0.074633937 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:32:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:09.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:10 np0005593294 nova_compute[225705]: 2026-01-23 10:32:10.177 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:11.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:11.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:12 np0005593294 nova_compute[225705]: 2026-01-23 10:32:12.045 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:12 np0005593294 nova_compute[225705]: 2026-01-23 10:32:12.045 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:32:12 np0005593294 nova_compute[225705]: 2026-01-23 10:32:12.045 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:32:12 np0005593294 nova_compute[225705]: 2026-01-23 10:32:12.098 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:32:12 np0005593294 nova_compute[225705]: 2026-01-23 10:32:12.188 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:13.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:13.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:13 np0005593294 nova_compute[225705]: 2026-01-23 10:32:13.923 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:15 np0005593294 nova_compute[225705]: 2026-01-23 10:32:15.195 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:15.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:15.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:15 np0005593294 nova_compute[225705]: 2026-01-23 10:32:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.902 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:32:16 np0005593294 nova_compute[225705]: 2026-01-23 10:32:16.902 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:17 np0005593294 nova_compute[225705]: 2026-01-23 10:32:17.191 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:17.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1685129570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:17 np0005593294 nova_compute[225705]: 2026-01-23 10:32:17.434 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:17 np0005593294 nova_compute[225705]: 2026-01-23 10:32:17.604 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:32:17 np0005593294 nova_compute[225705]: 2026-01-23 10:32:17.607 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:32:17 np0005593294 nova_compute[225705]: 2026-01-23 10:32:17.607 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:17 np0005593294 nova_compute[225705]: 2026-01-23 10:32:17.608 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:17.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.887292) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337887332, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 400, "num_deletes": 251, "total_data_size": 456611, "memory_usage": 464888, "flush_reason": "Manual Compaction"}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337891573, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 298037, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36369, "largest_seqno": 36764, "table_properties": {"data_size": 295738, "index_size": 463, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5740, "raw_average_key_size": 18, "raw_value_size": 291160, "raw_average_value_size": 948, "num_data_blocks": 20, "num_entries": 307, "num_filter_entries": 307, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164323, "oldest_key_time": 1769164323, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4328 microseconds, and 1505 cpu microseconds.
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891622) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 298037 bytes OK
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891644) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893595) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893612) EVENT_LOG_v1 {"time_micros": 1769164337893607, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893634) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 454028, prev total WAL file size 454028, number of live WAL files 2.
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.894054) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(291KB)], [66(13MB)]
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337894086, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 14714878, "oldest_snapshot_seqno": -1}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6656 keys, 12554193 bytes, temperature: kUnknown
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337969059, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12554193, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12512675, "index_size": 23806, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16645, "raw_key_size": 173491, "raw_average_key_size": 26, "raw_value_size": 12395764, "raw_average_value_size": 1862, "num_data_blocks": 943, "num_entries": 6656, "num_filter_entries": 6656, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.969457) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12554193 bytes
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.971213) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.9 rd, 167.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.7 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(91.5) write-amplify(42.1) OK, records in: 7166, records dropped: 510 output_compression: NoCompression
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.971233) EVENT_LOG_v1 {"time_micros": 1769164337971223, "job": 40, "event": "compaction_finished", "compaction_time_micros": 75126, "compaction_time_cpu_micros": 28493, "output_level": 6, "num_output_files": 1, "total_output_size": 12554193, "num_input_records": 7166, "num_output_records": 6656, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337972165, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337974850, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.109 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.110 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.241 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:32:18 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/204262838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.721 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.727 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.763 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.765 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:32:18 np0005593294 nova_compute[225705]: 2026-01-23 10:32:18.765 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:19.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:19 np0005593294 nova_compute[225705]: 2026-01-23 10:32:19.766 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:19 np0005593294 nova_compute[225705]: 2026-01-23 10:32:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:19 np0005593294 nova_compute[225705]: 2026-01-23 10:32:19.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:32:20 np0005593294 nova_compute[225705]: 2026-01-23 10:32:20.235 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:20 np0005593294 nova_compute[225705]: 2026-01-23 10:32:20.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:21.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:21.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:22 np0005593294 nova_compute[225705]: 2026-01-23 10:32:22.194 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:22 np0005593294 podman[246954]: 2026-01-23 10:32:22.70365427 +0000 UTC m=+0.104362652 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:32:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:23.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:23.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:25 np0005593294 nova_compute[225705]: 2026-01-23 10:32:25.238 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:25.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:27 np0005593294 nova_compute[225705]: 2026-01-23 10:32:27.196 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:27.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:29.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:29.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:30 np0005593294 nova_compute[225705]: 2026-01-23 10:32:30.242 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:31.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:31.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:32 np0005593294 nova_compute[225705]: 2026-01-23 10:32:32.199 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:33.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:33.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:35 np0005593294 nova_compute[225705]: 2026-01-23 10:32:35.246 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:35.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:35 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 05:32:35 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 05:32:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:36 np0005593294 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 05:32:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:37 np0005593294 nova_compute[225705]: 2026-01-23 10:32:37.201 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:37.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:38 np0005593294 podman[247013]: 2026-01-23 10:32:38.702225413 +0000 UTC m=+0.090974152 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:32:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:39.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:39.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:40 np0005593294 nova_compute[225705]: 2026-01-23 10:32:40.276 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:42 np0005593294 nova_compute[225705]: 2026-01-23 10:32:42.203 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:45 np0005593294 nova_compute[225705]: 2026-01-23 10:32:45.280 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:45.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:45.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:47 np0005593294 nova_compute[225705]: 2026-01-23 10:32:47.206 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:47.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:47.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:49.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:49.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:50 np0005593294 nova_compute[225705]: 2026-01-23 10:32:50.283 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:51.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:51.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:52 np0005593294 nova_compute[225705]: 2026-01-23 10:32:52.208 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:53.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:53 np0005593294 podman[247065]: 2026-01-23 10:32:53.730297805 +0000 UTC m=+0.132016661 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 05:32:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:53.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:32:55.064 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:32:55.065 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:32:55.065 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:55 np0005593294 nova_compute[225705]: 2026-01-23 10:32:55.286 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:55.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:55.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:32:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:32:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:32:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:32:57 np0005593294 nova_compute[225705]: 2026-01-23 10:32:57.209 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:57.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:59.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:32:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:59.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:00 np0005593294 nova_compute[225705]: 2026-01-23 10:33:00.290 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:33:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:01 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:33:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:01.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:01.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:02 np0005593294 nova_compute[225705]: 2026-01-23 10:33:02.211 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:03.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:05 np0005593294 nova_compute[225705]: 2026-01-23 10:33:05.293 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:05.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:05.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:05 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:07 np0005593294 nova_compute[225705]: 2026-01-23 10:33:07.213 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:07.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.464205) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387464256, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 743, "num_deletes": 251, "total_data_size": 1499624, "memory_usage": 1527584, "flush_reason": "Manual Compaction"}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387474950, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 980518, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36769, "largest_seqno": 37507, "table_properties": {"data_size": 976921, "index_size": 1441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7221, "raw_average_key_size": 17, "raw_value_size": 969770, "raw_average_value_size": 2298, "num_data_blocks": 62, "num_entries": 422, "num_filter_entries": 422, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164338, "oldest_key_time": 1769164338, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 10797 microseconds, and 6348 cpu microseconds.
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.475002) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 980518 bytes OK
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.475022) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.477030) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.477083) EVENT_LOG_v1 {"time_micros": 1769164387477069, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.477117) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1495708, prev total WAL file size 1495708, number of live WAL files 2.
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.478365) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(957KB)], [69(11MB)]
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387478426, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13534711, "oldest_snapshot_seqno": -1}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6562 keys, 12131851 bytes, temperature: kUnknown
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387562779, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12131851, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12090948, "index_size": 23383, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 173211, "raw_average_key_size": 26, "raw_value_size": 11975454, "raw_average_value_size": 1824, "num_data_blocks": 913, "num_entries": 6562, "num_filter_entries": 6562, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.563317) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12131851 bytes
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564748) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.3 rd, 143.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(26.2) write-amplify(12.4) OK, records in: 7078, records dropped: 516 output_compression: NoCompression
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564772) EVENT_LOG_v1 {"time_micros": 1769164387564761, "job": 42, "event": "compaction_finished", "compaction_time_micros": 84428, "compaction_time_cpu_micros": 34573, "output_level": 6, "num_output_files": 1, "total_output_size": 12131851, "num_input_records": 7078, "num_output_records": 6562, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387565306, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387568605, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.478235) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:07.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:09.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:09 np0005593294 podman[247231]: 2026-01-23 10:33:09.682078877 +0000 UTC m=+0.082104693 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:33:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:09.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:10 np0005593294 nova_compute[225705]: 2026-01-23 10:33:10.297 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:11.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:12 np0005593294 nova_compute[225705]: 2026-01-23 10:33:12.214 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:13 np0005593294 nova_compute[225705]: 2026-01-23 10:33:13.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:13 np0005593294 nova_compute[225705]: 2026-01-23 10:33:13.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:33:13 np0005593294 nova_compute[225705]: 2026-01-23 10:33:13.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:33:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:13 np0005593294 nova_compute[225705]: 2026-01-23 10:33:13.890 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:33:14 np0005593294 nova_compute[225705]: 2026-01-23 10:33:14.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:15 np0005593294 nova_compute[225705]: 2026-01-23 10:33:15.324 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:15.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:15 np0005593294 nova_compute[225705]: 2026-01-23 10:33:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:17 np0005593294 nova_compute[225705]: 2026-01-23 10:33:17.215 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:17.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:17 np0005593294 nova_compute[225705]: 2026-01-23 10:33:17.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593294 nova_compute[225705]: 2026-01-23 10:33:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593294 nova_compute[225705]: 2026-01-23 10:33:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.030 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.030 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.031 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.031 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.031 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:33:18 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/944075048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.508 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.743 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.746 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4856MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.746 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.747 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.846 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.846 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:33:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:18 np0005593294 nova_compute[225705]: 2026-01-23 10:33:18.897 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:33:19 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2868338412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:19 np0005593294 nova_compute[225705]: 2026-01-23 10:33:19.377 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:19 np0005593294 nova_compute[225705]: 2026-01-23 10:33:19.384 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:33:19 np0005593294 nova_compute[225705]: 2026-01-23 10:33:19.409 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:33:19 np0005593294 nova_compute[225705]: 2026-01-23 10:33:19.411 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:33:19 np0005593294 nova_compute[225705]: 2026-01-23 10:33:19.412 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:19.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:20 np0005593294 nova_compute[225705]: 2026-01-23 10:33:20.358 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:20 np0005593294 nova_compute[225705]: 2026-01-23 10:33:20.413 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:20 np0005593294 nova_compute[225705]: 2026-01-23 10:33:20.414 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:20 np0005593294 nova_compute[225705]: 2026-01-23 10:33:20.414 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:33:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:21.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:21.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:21 np0005593294 nova_compute[225705]: 2026-01-23 10:33:21.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:22 np0005593294 nova_compute[225705]: 2026-01-23 10:33:22.218 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:23.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:23.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:23 np0005593294 nova_compute[225705]: 2026-01-23 10:33:23.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:24 np0005593294 podman[247302]: 2026-01-23 10:33:24.682088174 +0000 UTC m=+0.085234011 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:33:25 np0005593294 nova_compute[225705]: 2026-01-23 10:33:25.360 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:25.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:25.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:27 np0005593294 nova_compute[225705]: 2026-01-23 10:33:27.220 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:27.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:27.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:29.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:29.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:30 np0005593294 nova_compute[225705]: 2026-01-23 10:33:30.364 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:31.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:31.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:32 np0005593294 nova_compute[225705]: 2026-01-23 10:33:32.223 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:33.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:33.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:35 np0005593294 nova_compute[225705]: 2026-01-23 10:33:35.406 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:35.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:37 np0005593294 nova_compute[225705]: 2026-01-23 10:33:37.224 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:37.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:37.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:39.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:39.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:40 np0005593294 nova_compute[225705]: 2026-01-23 10:33:40.411 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:40 np0005593294 podman[247361]: 2026-01-23 10:33:40.644860829 +0000 UTC m=+0.052944175 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:33:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:41.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:41.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:42 np0005593294 nova_compute[225705]: 2026-01-23 10:33:42.257 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:43.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:43.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:45 np0005593294 nova_compute[225705]: 2026-01-23 10:33:45.414 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:45.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:45.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:47 np0005593294 nova_compute[225705]: 2026-01-23 10:33:47.259 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:47.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:47.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:49.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:49.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:50 np0005593294 nova_compute[225705]: 2026-01-23 10:33:50.415 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:51.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:52 np0005593294 nova_compute[225705]: 2026-01-23 10:33:52.263 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:53.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:53.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:33:55.066 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:33:55.066 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:33:55.066 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:55 np0005593294 nova_compute[225705]: 2026-01-23 10:33:55.418 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:55.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:55 np0005593294 podman[247415]: 2026-01-23 10:33:55.709455459 +0000 UTC m=+0.115772551 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 05:33:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:55.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:33:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:33:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:33:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:33:57 np0005593294 nova_compute[225705]: 2026-01-23 10:33:57.304 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:57.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:57.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:33:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:59.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:00 np0005593294 nova_compute[225705]: 2026-01-23 10:34:00.421 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:01.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:01.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:02 np0005593294 nova_compute[225705]: 2026-01-23 10:34:02.306 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:03.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:03.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:05 np0005593294 nova_compute[225705]: 2026-01-23 10:34:05.424 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:05.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:05.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:34:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:34:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:07 np0005593294 nova_compute[225705]: 2026-01-23 10:34:07.313 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:07.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:07.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:09.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:09.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:10 np0005593294 nova_compute[225705]: 2026-01-23 10:34:10.429 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:11 np0005593294 podman[247579]: 2026-01-23 10:34:11.497007685 +0000 UTC m=+0.063332652 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:34:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:11.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:11.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:12 np0005593294 nova_compute[225705]: 2026-01-23 10:34:12.319 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:13.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:13 np0005593294 nova_compute[225705]: 2026-01-23 10:34:13.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:13 np0005593294 nova_compute[225705]: 2026-01-23 10:34:13.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:34:13 np0005593294 nova_compute[225705]: 2026-01-23 10:34:13.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:34:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:13.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:13 np0005593294 nova_compute[225705]: 2026-01-23 10:34:13.902 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:34:15 np0005593294 nova_compute[225705]: 2026-01-23 10:34:15.433 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:15.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:15.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:16 np0005593294 nova_compute[225705]: 2026-01-23 10:34:16.896 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:17 np0005593294 nova_compute[225705]: 2026-01-23 10:34:17.324 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:17.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:17 np0005593294 nova_compute[225705]: 2026-01-23 10:34:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:17 np0005593294 nova_compute[225705]: 2026-01-23 10:34:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:17.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:18 np0005593294 nova_compute[225705]: 2026-01-23 10:34:18.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:18 np0005593294 nova_compute[225705]: 2026-01-23 10:34:18.928 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:18 np0005593294 nova_compute[225705]: 2026-01-23 10:34:18.929 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:18 np0005593294 nova_compute[225705]: 2026-01-23 10:34:18.929 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:18 np0005593294 nova_compute[225705]: 2026-01-23 10:34:18.929 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:34:18 np0005593294 nova_compute[225705]: 2026-01-23 10:34:18.930 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:34:19 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1917450218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:19 np0005593294 nova_compute[225705]: 2026-01-23 10:34:19.392 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:19.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:19 np0005593294 nova_compute[225705]: 2026-01-23 10:34:19.573 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:34:19 np0005593294 nova_compute[225705]: 2026-01-23 10:34:19.574 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4861MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:34:19 np0005593294 nova_compute[225705]: 2026-01-23 10:34:19.574 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:19 np0005593294 nova_compute[225705]: 2026-01-23 10:34:19.575 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:19.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.106 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.107 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.132 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.158 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.159 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.176 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.211 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.228 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.481 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:34:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1049699909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.735 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.743 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.890 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.892 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:34:20 np0005593294 nova_compute[225705]: 2026-01-23 10:34:20.893 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:21.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:21.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:21 np0005593294 nova_compute[225705]: 2026-01-23 10:34:21.893 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:21 np0005593294 nova_compute[225705]: 2026-01-23 10:34:21.894 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:21 np0005593294 nova_compute[225705]: 2026-01-23 10:34:21.894 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:21 np0005593294 nova_compute[225705]: 2026-01-23 10:34:21.894 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:34:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:22 np0005593294 nova_compute[225705]: 2026-01-23 10:34:22.327 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:23.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:23 np0005593294 nova_compute[225705]: 2026-01-23 10:34:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:23.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:25 np0005593294 nova_compute[225705]: 2026-01-23 10:34:25.484 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:25.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:25.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:26 np0005593294 podman[247651]: 2026-01-23 10:34:26.726587849 +0000 UTC m=+0.109790202 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Jan 23 05:34:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:27 np0005593294 nova_compute[225705]: 2026-01-23 10:34:27.328 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:27.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:27.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:29.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:29.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:30 np0005593294 nova_compute[225705]: 2026-01-23 10:34:30.487 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:31.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:31.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:32 np0005593294 nova_compute[225705]: 2026-01-23 10:34:32.331 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:33.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:33.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:35 np0005593294 nova_compute[225705]: 2026-01-23 10:34:35.491 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:35.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:35.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:37 np0005593294 nova_compute[225705]: 2026-01-23 10:34:37.374 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:37.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:37.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:38 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:34:38 np0005593294 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:34:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:39.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:39.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:40 np0005593294 nova_compute[225705]: 2026-01-23 10:34:40.495 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:41.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:41 np0005593294 podman[247711]: 2026-01-23 10:34:41.660940103 +0000 UTC m=+0.062141585 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:34:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:41.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:42 np0005593294 nova_compute[225705]: 2026-01-23 10:34:42.376 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:43.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:43.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:45 np0005593294 nova_compute[225705]: 2026-01-23 10:34:45.497 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:45.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:45.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:47 np0005593294 nova_compute[225705]: 2026-01-23 10:34:47.377 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:47.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:49.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:49.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:50 np0005593294 nova_compute[225705]: 2026-01-23 10:34:50.501 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:51.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:52 np0005593294 nova_compute[225705]: 2026-01-23 10:34:52.378 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:53.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:53.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:34:55.067 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:34:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:34:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:55 np0005593294 nova_compute[225705]: 2026-01-23 10:34:55.505 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:55.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:55.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:34:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:34:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:34:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:34:57 np0005593294 nova_compute[225705]: 2026-01-23 10:34:57.380 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:57.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:57 np0005593294 podman[247764]: 2026-01-23 10:34:57.714607327 +0000 UTC m=+0.116012069 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:34:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:57.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:59.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:34:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:59.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:00 np0005593294 nova_compute[225705]: 2026-01-23 10:35:00.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:01.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:01.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:02 np0005593294 nova_compute[225705]: 2026-01-23 10:35:02.381 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:03.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:03.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:05 np0005593294 nova_compute[225705]: 2026-01-23 10:35:05.510 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:05.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:05.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:07 np0005593294 nova_compute[225705]: 2026-01-23 10:35:07.383 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:07.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:07.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:09.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:09.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:10 np0005593294 nova_compute[225705]: 2026-01-23 10:35:10.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:11.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:11 np0005593294 podman[247872]: 2026-01-23 10:35:11.763389873 +0000 UTC m=+0.054547145 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:35:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:11.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:12 np0005593294 nova_compute[225705]: 2026-01-23 10:35:12.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:12 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:13.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:13 np0005593294 nova_compute[225705]: 2026-01-23 10:35:13.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:13 np0005593294 nova_compute[225705]: 2026-01-23 10:35:13.877 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:35:13 np0005593294 nova_compute[225705]: 2026-01-23 10:35:13.877 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:35:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:13.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:14 np0005593294 nova_compute[225705]: 2026-01-23 10:35:14.536 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:35:15 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:15 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:15 np0005593294 nova_compute[225705]: 2026-01-23 10:35:15.532 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:15.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:15.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:35:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:17 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:35:17 np0005593294 nova_compute[225705]: 2026-01-23 10:35:17.388 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:17 np0005593294 nova_compute[225705]: 2026-01-23 10:35:17.529 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:17.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:17.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:18 np0005593294 nova_compute[225705]: 2026-01-23 10:35:18.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:19.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.924 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.925 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.925 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.925 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:35:19 np0005593294 nova_compute[225705]: 2026-01-23 10:35:19.926 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:19.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:35:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/905573868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.414 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.535 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.620 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.622 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4854MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.623 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.624 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.853 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.854 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:35:20 np0005593294 nova_compute[225705]: 2026-01-23 10:35:20.875 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:35:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/573725531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:21 np0005593294 nova_compute[225705]: 2026-01-23 10:35:21.374 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:21 np0005593294 nova_compute[225705]: 2026-01-23 10:35:21.384 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:35:21 np0005593294 nova_compute[225705]: 2026-01-23 10:35:21.464 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:35:21 np0005593294 nova_compute[225705]: 2026-01-23 10:35:21.467 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:35:21 np0005593294 nova_compute[225705]: 2026-01-23 10:35:21.468 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:21.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:21.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:22 np0005593294 nova_compute[225705]: 2026-01-23 10:35:22.391 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:23 np0005593294 nova_compute[225705]: 2026-01-23 10:35:23.468 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:23 np0005593294 nova_compute[225705]: 2026-01-23 10:35:23.469 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:23 np0005593294 nova_compute[225705]: 2026-01-23 10:35:23.469 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:35:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:23.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:23 np0005593294 nova_compute[225705]: 2026-01-23 10:35:23.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:23.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:24 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:25 np0005593294 nova_compute[225705]: 2026-01-23 10:35:25.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:25.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:25 np0005593294 nova_compute[225705]: 2026-01-23 10:35:25.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:25.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:27 np0005593294 nova_compute[225705]: 2026-01-23 10:35:27.393 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:27.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:28 np0005593294 podman[248001]: 2026-01-23 10:35:28.732061928 +0000 UTC m=+0.123939487 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:35:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:29 np0005593294 nova_compute[225705]: 2026-01-23 10:35:29.580 225709 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:29 np0005593294 nova_compute[225705]: 2026-01-23 10:35:29.614 225709 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:29.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:29.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:30 np0005593294 nova_compute[225705]: 2026-01-23 10:35:30.543 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:31.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:31.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:32 np0005593294 nova_compute[225705]: 2026-01-23 10:35:32.415 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:35:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:33.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:35:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:33.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:35 np0005593294 nova_compute[225705]: 2026-01-23 10:35:35.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:35:35.629 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:35:35 np0005593294 nova_compute[225705]: 2026-01-23 10:35:35.630 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:35:35.631 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:35:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:35.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:35:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:35.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:35:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:37 np0005593294 nova_compute[225705]: 2026-01-23 10:35:37.418 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:37.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:39.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:40 np0005593294 nova_compute[225705]: 2026-01-23 10:35:40.547 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:40 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:35:40.634 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:35:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:41.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:35:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:41.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:42 np0005593294 nova_compute[225705]: 2026-01-23 10:35:42.420 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:42 np0005593294 podman[248060]: 2026-01-23 10:35:42.653338549 +0000 UTC m=+0.059426050 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:35:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:43.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:43.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:45 np0005593294 nova_compute[225705]: 2026-01-23 10:35:45.550 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:45.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:45.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:47 np0005593294 nova_compute[225705]: 2026-01-23 10:35:47.423 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:47.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:47.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:49.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:50.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:50 np0005593294 nova_compute[225705]: 2026-01-23 10:35:50.554 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:51.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:52 np0005593294 nova_compute[225705]: 2026-01-23 10:35:52.463 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:53.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:54 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 23 05:35:54 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:54.992564) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:35:54 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 23 05:35:54 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554992603, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1869, "num_deletes": 251, "total_data_size": 4935419, "memory_usage": 5020464, "flush_reason": "Manual Compaction"}
Jan 23 05:35:54 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 23 05:35:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:35:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:35:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:35:55.069 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555362000, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3204080, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37512, "largest_seqno": 39376, "table_properties": {"data_size": 3196224, "index_size": 4735, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16131, "raw_average_key_size": 20, "raw_value_size": 3180644, "raw_average_value_size": 3995, "num_data_blocks": 201, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164388, "oldest_key_time": 1769164388, "file_creation_time": 1769164554, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 369483 microseconds, and 6350 cpu microseconds.
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:35:55 np0005593294 nova_compute[225705]: 2026-01-23 10:35:55.557 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.362046) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3204080 bytes OK
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.362068) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.681309) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.681374) EVENT_LOG_v1 {"time_micros": 1769164555681361, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.681406) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4927089, prev total WAL file size 4927370, number of live WAL files 2.
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.683586) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3128KB)], [72(11MB)]
Jan 23 05:35:55 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555683677, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15335931, "oldest_snapshot_seqno": -1}
Jan 23 05:35:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:55.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:56.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6840 keys, 13115150 bytes, temperature: kUnknown
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164556443953, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13115150, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13071551, "index_size": 25375, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 179722, "raw_average_key_size": 26, "raw_value_size": 12950149, "raw_average_value_size": 1893, "num_data_blocks": 991, "num_entries": 6840, "num_filter_entries": 6840, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164555, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.444406) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13115150 bytes
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.550432) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 20.2 rd, 17.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 11.6 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(8.9) write-amplify(4.1) OK, records in: 7358, records dropped: 518 output_compression: NoCompression
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.550528) EVENT_LOG_v1 {"time_micros": 1769164556550464, "job": 44, "event": "compaction_finished", "compaction_time_micros": 760396, "compaction_time_cpu_micros": 55113, "output_level": 6, "num_output_files": 1, "total_output_size": 13115150, "num_input_records": 7358, "num_output_records": 6840, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164556552228, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164556556941, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.683417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:56 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:35:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:35:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:35:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:35:57 np0005593294 nova_compute[225705]: 2026-01-23 10:35:57.467 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:57.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:35:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:59.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:59 np0005593294 podman[248113]: 2026-01-23 10:35:59.753024444 +0000 UTC m=+0.151870787 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 05:36:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:00 np0005593294 nova_compute[225705]: 2026-01-23 10:36:00.560 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:01.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:02.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:02 np0005593294 nova_compute[225705]: 2026-01-23 10:36:02.469 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:03.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:04.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:05 np0005593294 nova_compute[225705]: 2026-01-23 10:36:05.563 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:05.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:07 np0005593294 nova_compute[225705]: 2026-01-23 10:36:07.471 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:07.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:08.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:09.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:10.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:10 np0005593294 nova_compute[225705]: 2026-01-23 10:36:10.572 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:11.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:12 np0005593294 nova_compute[225705]: 2026-01-23 10:36:12.473 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:13 np0005593294 podman[248172]: 2026-01-23 10:36:13.669568874 +0000 UTC m=+0.074522555 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:36:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:13.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:13 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:14.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:14 np0005593294 nova_compute[225705]: 2026-01-23 10:36:14.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:14 np0005593294 nova_compute[225705]: 2026-01-23 10:36:14.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:36:14 np0005593294 nova_compute[225705]: 2026-01-23 10:36:14.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:36:14 np0005593294 nova_compute[225705]: 2026-01-23 10:36:14.978 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:36:14 np0005593294 nova_compute[225705]: 2026-01-23 10:36:14.979 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:14 np0005593294 nova_compute[225705]: 2026-01-23 10:36:14.979 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:36:15 np0005593294 nova_compute[225705]: 2026-01-23 10:36:15.578 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:15.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:16.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:17 np0005593294 nova_compute[225705]: 2026-01-23 10:36:17.135 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:17 np0005593294 nova_compute[225705]: 2026-01-23 10:36:17.473 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:17.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:18.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:18 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:19.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:19 np0005593294 nova_compute[225705]: 2026-01-23 10:36:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:19 np0005593294 nova_compute[225705]: 2026-01-23 10:36:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:20.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.209 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.210 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.210 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.211 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.211 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.582 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:36:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1725171694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.683 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.854 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.855 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4854MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.855 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:20 np0005593294 nova_compute[225705]: 2026-01-23 10:36:20.856 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.109 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.110 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.155 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:36:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1801997280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.620 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.629 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:21.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.838 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.841 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.841 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.842 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:21 np0005593294 nova_compute[225705]: 2026-01-23 10:36:21.842 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:36:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:22.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:22 np0005593294 nova_compute[225705]: 2026-01-23 10:36:22.476 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:22 np0005593294 nova_compute[225705]: 2026-01-23 10:36:22.661 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:36:23 np0005593294 nova_compute[225705]: 2026-01-23 10:36:23.661 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:23 np0005593294 nova_compute[225705]: 2026-01-23 10:36:23.662 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:23 np0005593294 nova_compute[225705]: 2026-01-23 10:36:23.662 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:23.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:23 np0005593294 nova_compute[225705]: 2026-01-23 10:36:23.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:23 np0005593294 nova_compute[225705]: 2026-01-23 10:36:23.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:36:23 np0005593294 nova_compute[225705]: 2026-01-23 10:36:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:24.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:24 np0005593294 podman[248362]: 2026-01-23 10:36:24.281495924 +0000 UTC m=+0.222571778 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:36:24 np0005593294 podman[248362]: 2026-01-23 10:36:24.450666143 +0000 UTC m=+0.391741697 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:36:25 np0005593294 podman[248481]: 2026-01-23 10:36:25.051930778 +0000 UTC m=+0.080873334 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:36:25 np0005593294 podman[248505]: 2026-01-23 10:36:25.130713945 +0000 UTC m=+0.058783029 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:36:25 np0005593294 podman[248481]: 2026-01-23 10:36:25.149881187 +0000 UTC m=+0.178823733 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:36:25 np0005593294 podman[248567]: 2026-01-23 10:36:25.544241075 +0000 UTC m=+0.071162248 container exec 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:36:25 np0005593294 podman[248567]: 2026-01-23 10:36:25.564074098 +0000 UTC m=+0.090995271 container exec_died 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:36:25 np0005593294 nova_compute[225705]: 2026-01-23 10:36:25.584 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:25.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:25 np0005593294 podman[248634]: 2026-01-23 10:36:25.960836783 +0000 UTC m=+0.208510166 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 05:36:25 np0005593294 podman[248634]: 2026-01-23 10:36:25.976158045 +0000 UTC m=+0.223831388 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 05:36:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:26.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:26 np0005593294 nova_compute[225705]: 2026-01-23 10:36:26.165 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:26 np0005593294 podman[248700]: 2026-01-23 10:36:26.269671903 +0000 UTC m=+0.063005402 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.buildah.version=1.28.2, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, name=keepalived, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, build-date=2023-02-22T09:23:20)
Jan 23 05:36:26 np0005593294 podman[248720]: 2026-01-23 10:36:26.465751069 +0000 UTC m=+0.169287165 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=keepalived for Ceph, release=1793, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.openshift.expose-services=)
Jan 23 05:36:26 np0005593294 podman[248700]: 2026-01-23 10:36:26.607311299 +0000 UTC m=+0.400644818 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, version=2.2.4, description=keepalived for Ceph, release=1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, architecture=x86_64)
Jan 23 05:36:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:27 np0005593294 nova_compute[225705]: 2026-01-23 10:36:27.477 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:27 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:27.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:28.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:36:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:28 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:36:28 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:29.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:30.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:30 np0005593294 nova_compute[225705]: 2026-01-23 10:36:30.587 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:30 np0005593294 podman[248841]: 2026-01-23 10:36:30.760964976 +0000 UTC m=+0.142224424 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:36:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:31.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:32.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:32 np0005593294 nova_compute[225705]: 2026-01-23 10:36:32.480 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:33.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:33 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:34.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:34 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:35 np0005593294 nova_compute[225705]: 2026-01-23 10:36:35.591 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:35.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:35 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:36.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:37 np0005593294 nova_compute[225705]: 2026-01-23 10:36:37.484 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:37.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:38.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:38 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:39.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:40.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:40 np0005593294 nova_compute[225705]: 2026-01-23 10:36:40.594 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:41.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:42.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:42 np0005593294 nova_compute[225705]: 2026-01-23 10:36:42.484 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:43.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:43 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:44.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:44 np0005593294 podman[248901]: 2026-01-23 10:36:44.647275374 +0000 UTC m=+0.052377179 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:36:45 np0005593294 nova_compute[225705]: 2026-01-23 10:36:45.599 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:45.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:46.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:47 np0005593294 nova_compute[225705]: 2026-01-23 10:36:47.486 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:47.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:49.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:50 np0005593294 nova_compute[225705]: 2026-01-23 10:36:50.602 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:51.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:52.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:52 np0005593294 nova_compute[225705]: 2026-01-23 10:36:52.489 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:53.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:53 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:36:55.070 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:36:55.070 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:36:55.070 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:55 np0005593294 nova_compute[225705]: 2026-01-23 10:36:55.607 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:55.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:56.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:36:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:36:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:36:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:36:57 np0005593294 nova_compute[225705]: 2026-01-23 10:36:57.492 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:57.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:58.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:36:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:59.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:00.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:00 np0005593294 nova_compute[225705]: 2026-01-23 10:37:00.610 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:01 np0005593294 podman[248955]: 2026-01-23 10:37:01.75596955 +0000 UTC m=+0.155487999 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 05:37:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:01.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:02.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:02 np0005593294 nova_compute[225705]: 2026-01-23 10:37:02.494 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:03.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:04.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:05 np0005593294 nova_compute[225705]: 2026-01-23 10:37:05.614 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:05.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:06.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:07 np0005593294 nova_compute[225705]: 2026-01-23 10:37:07.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:07.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:08.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:09.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:10.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:10 np0005593294 nova_compute[225705]: 2026-01-23 10:37:10.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:11.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:12.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:12 np0005593294 nova_compute[225705]: 2026-01-23 10:37:12.497 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:13.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:14.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:15 np0005593294 nova_compute[225705]: 2026-01-23 10:37:15.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:15 np0005593294 podman[249013]: 2026-01-23 10:37:15.666292425 +0000 UTC m=+0.068900318 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:37:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:15.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:15 np0005593294 nova_compute[225705]: 2026-01-23 10:37:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:15 np0005593294 nova_compute[225705]: 2026-01-23 10:37:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:37:15 np0005593294 nova_compute[225705]: 2026-01-23 10:37:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:37:15 np0005593294 nova_compute[225705]: 2026-01-23 10:37:15.911 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:37:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:16.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:16 np0005593294 nova_compute[225705]: 2026-01-23 10:37:16.905 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:17 np0005593294 nova_compute[225705]: 2026-01-23 10:37:17.499 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:17.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:18.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.942 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.943 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.943 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.944 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:37:19 np0005593294 nova_compute[225705]: 2026-01-23 10:37:19.944 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:20.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:20 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:37:20 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2129370149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.432 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.604 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.606 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4849MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.606 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.607 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.625 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.745 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.746 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:37:20 np0005593294 nova_compute[225705]: 2026-01-23 10:37:20.805 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:37:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3810051989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:21 np0005593294 nova_compute[225705]: 2026-01-23 10:37:21.263 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:21 np0005593294 nova_compute[225705]: 2026-01-23 10:37:21.270 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:37:21 np0005593294 nova_compute[225705]: 2026-01-23 10:37:21.289 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:37:21 np0005593294 nova_compute[225705]: 2026-01-23 10:37:21.291 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:37:21 np0005593294 nova_compute[225705]: 2026-01-23 10:37:21.292 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:21.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:22.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:22 np0005593294 nova_compute[225705]: 2026-01-23 10:37:22.502 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:23 np0005593294 nova_compute[225705]: 2026-01-23 10:37:23.292 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:23 np0005593294 nova_compute[225705]: 2026-01-23 10:37:23.292 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:23 np0005593294 nova_compute[225705]: 2026-01-23 10:37:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:24.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:24 np0005593294 nova_compute[225705]: 2026-01-23 10:37:24.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:24 np0005593294 nova_compute[225705]: 2026-01-23 10:37:24.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:37:25 np0005593294 nova_compute[225705]: 2026-01-23 10:37:25.629 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:25.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:25 np0005593294 nova_compute[225705]: 2026-01-23 10:37:25.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:26.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:26 np0005593294 nova_compute[225705]: 2026-01-23 10:37:26.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:27 np0005593294 nova_compute[225705]: 2026-01-23 10:37:27.505 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:27.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:28.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:30.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:30 np0005593294 nova_compute[225705]: 2026-01-23 10:37:30.633 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:32.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:32 np0005593294 nova_compute[225705]: 2026-01-23 10:37:32.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:32 np0005593294 podman[249109]: 2026-01-23 10:37:32.684646214 +0000 UTC m=+0.086920764 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:37:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:33.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:34.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:35 np0005593294 nova_compute[225705]: 2026-01-23 10:37:35.637 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:36.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:37 np0005593294 nova_compute[225705]: 2026-01-23 10:37:37.511 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:38.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:40.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:40 np0005593294 nova_compute[225705]: 2026-01-23 10:37:40.690 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:41 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:41.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:42.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:42 np0005593294 nova_compute[225705]: 2026-01-23 10:37:42.513 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:42 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:37:42 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:42 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:42 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:37:43 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:43 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:43 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:43.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:44.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:44 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:45 np0005593294 nova_compute[225705]: 2026-01-23 10:37:45.742 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:45 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:45 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:45 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:45.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:46.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:46 np0005593294 podman[249222]: 2026-01-23 10:37:46.647370425 +0000 UTC m=+0.054171033 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:37:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:47 np0005593294 nova_compute[225705]: 2026-01-23 10:37:47.515 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:47 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:47 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:47 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:47.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:48.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.073462) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669073530, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1357, "num_deletes": 257, "total_data_size": 3394338, "memory_usage": 3463760, "flush_reason": "Manual Compaction"}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669360911, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2198714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39382, "largest_seqno": 40733, "table_properties": {"data_size": 2192833, "index_size": 3208, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12447, "raw_average_key_size": 19, "raw_value_size": 2180973, "raw_average_value_size": 3450, "num_data_blocks": 137, "num_entries": 632, "num_filter_entries": 632, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164555, "oldest_key_time": 1769164555, "file_creation_time": 1769164669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 287509 microseconds, and 9364 cpu microseconds.
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.360968) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2198714 bytes OK
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.360992) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.402440) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.402542) EVENT_LOG_v1 {"time_micros": 1769164669402479, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.402573) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3387978, prev total WAL file size 3405698, number of live WAL files 2.
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.403969) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303130' seq:72057594037927935, type:22 .. '6C6F676D0031323633' seq:0, type:0; will stop at (end)
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2147KB)], [75(12MB)]
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669404063, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15313864, "oldest_snapshot_seqno": -1}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6942 keys, 15161136 bytes, temperature: kUnknown
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669506052, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15161136, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15114592, "index_size": 28064, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182798, "raw_average_key_size": 26, "raw_value_size": 14989250, "raw_average_value_size": 2159, "num_data_blocks": 1100, "num_entries": 6942, "num_filter_entries": 6942, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.507212) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15161136 bytes
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.508896) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.0 rd, 148.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 12.5 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(13.9) write-amplify(6.9) OK, records in: 7472, records dropped: 530 output_compression: NoCompression
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.508927) EVENT_LOG_v1 {"time_micros": 1769164669508915, "job": 46, "event": "compaction_finished", "compaction_time_micros": 102059, "compaction_time_cpu_micros": 37232, "output_level": 6, "num_output_files": 1, "total_output_size": 15161136, "num_input_records": 7472, "num_output_records": 6942, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669509437, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669512107, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.403833) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:49 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:49 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:49.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:50.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:50 np0005593294 nova_compute[225705]: 2026-01-23 10:37:50.759 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:51 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:51 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:51 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:51.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:52.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:52 np0005593294 nova_compute[225705]: 2026-01-23 10:37:52.518 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:53 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:53 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:53 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:53.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:54.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:37:55.071 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:37:55.072 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:37:55.072 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:55 np0005593294 nova_compute[225705]: 2026-01-23 10:37:55.833 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:55 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:55 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:55 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:55.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:56.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:56 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:37:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:37:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:37:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:37:57 np0005593294 nova_compute[225705]: 2026-01-23 10:37:57.520 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:57 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:57 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:57 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:57.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:58.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:59 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:37:59 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:59 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:59.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:00.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:00 np0005593294 nova_compute[225705]: 2026-01-23 10:38:00.846 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:01 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:01 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:01 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:01.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:02.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:02 np0005593294 nova_compute[225705]: 2026-01-23 10:38:02.522 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:03 np0005593294 systemd[1]: Starting dnf makecache...
Jan 23 05:38:03 np0005593294 podman[249300]: 2026-01-23 10:38:03.697378857 +0000 UTC m=+0.100779670 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:38:03 np0005593294 dnf[249301]: Metadata cache refreshed recently.
Jan 23 05:38:03 np0005593294 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 05:38:03 np0005593294 systemd[1]: Finished dnf makecache.
Jan 23 05:38:03 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:03 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:03 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:03.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:04.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:05 np0005593294 nova_compute[225705]: 2026-01-23 10:38:05.885 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:05 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:05 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:05 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:05.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:06.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:07 np0005593294 nova_compute[225705]: 2026-01-23 10:38:07.525 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:07 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:07 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:07 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:07.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:08.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:09 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:09 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:09 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:09.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:38:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:10.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:38:10 np0005593294 nova_compute[225705]: 2026-01-23 10:38:10.925 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:11 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:11 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:11 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:11.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:12.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:12 np0005593294 nova_compute[225705]: 2026-01-23 10:38:12.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:13 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:13 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:13 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:14.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:15 np0005593294 nova_compute[225705]: 2026-01-23 10:38:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:15 np0005593294 nova_compute[225705]: 2026-01-23 10:38:15.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:38:15 np0005593294 nova_compute[225705]: 2026-01-23 10:38:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:38:15 np0005593294 nova_compute[225705]: 2026-01-23 10:38:15.891 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:38:15 np0005593294 nova_compute[225705]: 2026-01-23 10:38:15.930 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:15 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:15 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:15 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:15.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:16.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:17 np0005593294 nova_compute[225705]: 2026-01-23 10:38:17.531 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:17 np0005593294 podman[249360]: 2026-01-23 10:38:17.7074783 +0000 UTC m=+0.092225550 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:38:17 np0005593294 nova_compute[225705]: 2026-01-23 10:38:17.886 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:17 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:17 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:17 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:17.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:18.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.904 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:38:19 np0005593294 nova_compute[225705]: 2026-01-23 10:38:19.904 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:19 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:19 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:19 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:19.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:20.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:20 np0005593294 nova_compute[225705]: 2026-01-23 10:38:20.934 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:38:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2310328428' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.077 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.287 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.289 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4836MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.290 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.292 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.361 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.361 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.437 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:21 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:38:21 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/387035444' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.937 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.945 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:38:21 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:21 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:21 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:21.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.964 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.967 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:38:21 np0005593294 nova_compute[225705]: 2026-01-23 10:38:21.967 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:22.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:22 np0005593294 nova_compute[225705]: 2026-01-23 10:38:22.533 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:23 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:23 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:23 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:24.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.936 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:25 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:25 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:25.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.967 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.968 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.968 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.968 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.969 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:25 np0005593294 nova_compute[225705]: 2026-01-23 10:38:25.969 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:38:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:26.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:27 np0005593294 nova_compute[225705]: 2026-01-23 10:38:27.535 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:27 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:27 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:27 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:27.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:28.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:29 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:29 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:29 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:29.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:30.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:30 np0005593294 nova_compute[225705]: 2026-01-23 10:38:30.938 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:31 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:31 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:31 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:31.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:32.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:32 np0005593294 nova_compute[225705]: 2026-01-23 10:38:32.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:33 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:33 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:33 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:33.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:34.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:34 np0005593294 podman[249458]: 2026-01-23 10:38:34.711098523 +0000 UTC m=+0.110435334 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:38:35 np0005593294 nova_compute[225705]: 2026-01-23 10:38:35.942 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:35 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:35 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:35 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:35.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:36.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:37 np0005593294 nova_compute[225705]: 2026-01-23 10:38:37.541 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:37 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:37 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:37 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:37.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:38.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:39 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:39 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:39 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:39.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:40.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:40 np0005593294 nova_compute[225705]: 2026-01-23 10:38:40.945 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:41 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:41 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:41 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:41.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:42.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:42 np0005593294 nova_compute[225705]: 2026-01-23 10:38:42.543 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:44.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:44.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:44 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:45 np0005593294 nova_compute[225705]: 2026-01-23 10:38:45.949 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:46.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:46.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:47 np0005593294 nova_compute[225705]: 2026-01-23 10:38:47.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:48.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:48.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:48 np0005593294 podman[249492]: 2026-01-23 10:38:48.669644645 +0000 UTC m=+0.060933416 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:38:49 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:50.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:50.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:50 np0005593294 nova_compute[225705]: 2026-01-23 10:38:50.953 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:52.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:52.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:52 np0005593294 nova_compute[225705]: 2026-01-23 10:38:52.547 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:54.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:54.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:38:55.072 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:38:55.073 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:38:55.073 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:55 np0005593294 nova_compute[225705]: 2026-01-23 10:38:55.957 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:56.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:56.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:38:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:38:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:38:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:38:57 np0005593294 nova_compute[225705]: 2026-01-23 10:38:57.548 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:58.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:38:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:58.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:58 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:38:58 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:38:58 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:38:58 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:38:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:00.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:00.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:00 np0005593294 nova_compute[225705]: 2026-01-23 10:39:00.959 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 05:39:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:02.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 05:39:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:02.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:02 np0005593294 nova_compute[225705]: 2026-01-23 10:39:02.549 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:04.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:04 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:39:04 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:39:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:04.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:05 np0005593294 podman[249653]: 2026-01-23 10:39:05.729536132 +0000 UTC m=+0.129121381 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:39:05 np0005593294 nova_compute[225705]: 2026-01-23 10:39:05.961 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:06.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:06.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:07 np0005593294 nova_compute[225705]: 2026-01-23 10:39:07.551 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:08.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:08.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:10.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:10.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:10 np0005593294 nova_compute[225705]: 2026-01-23 10:39:10.965 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:12.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:12.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:12 np0005593294 nova_compute[225705]: 2026-01-23 10:39:12.554 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:14.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:14.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:15 np0005593294 nova_compute[225705]: 2026-01-23 10:39:15.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:15 np0005593294 nova_compute[225705]: 2026-01-23 10:39:15.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:39:15 np0005593294 nova_compute[225705]: 2026-01-23 10:39:15.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:39:15 np0005593294 nova_compute[225705]: 2026-01-23 10:39:15.969 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:16.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:16 np0005593294 nova_compute[225705]: 2026-01-23 10:39:16.088 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:39:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:16.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:17 np0005593294 nova_compute[225705]: 2026-01-23 10:39:17.555 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:18.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:18 np0005593294 nova_compute[225705]: 2026-01-23 10:39:18.081 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:18.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:19 np0005593294 podman[249711]: 2026-01-23 10:39:19.654224368 +0000 UTC m=+0.055315150 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:39:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:20.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:20 np0005593294 nova_compute[225705]: 2026-01-23 10:39:20.973 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:21 np0005593294 nova_compute[225705]: 2026-01-23 10:39:21.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:21 np0005593294 nova_compute[225705]: 2026-01-23 10:39:21.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:22.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:22 np0005593294 nova_compute[225705]: 2026-01-23 10:39:22.546 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:22 np0005593294 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:22 np0005593294 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:22 np0005593294 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:39:22 np0005593294 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:22 np0005593294 nova_compute[225705]: 2026-01-23 10:39:22.564 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:23 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:39:23 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1085583944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.027 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.210 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.211 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4841MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.212 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.212 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.507 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.508 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.538 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.571 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.572 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.585 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.606 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:39:23 np0005593294 nova_compute[225705]: 2026-01-23 10:39:23.623 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:24.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:39:24 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2536873901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:24 np0005593294 nova_compute[225705]: 2026-01-23 10:39:24.101 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:24 np0005593294 nova_compute[225705]: 2026-01-23 10:39:24.108 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:24 np0005593294 nova_compute[225705]: 2026-01-23 10:39:24.137 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:24 np0005593294 nova_compute[225705]: 2026-01-23 10:39:24.141 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:39:24 np0005593294 nova_compute[225705]: 2026-01-23 10:39:24.142 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:24.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:25 np0005593294 nova_compute[225705]: 2026-01-23 10:39:25.977 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:26.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:26.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:27 np0005593294 nova_compute[225705]: 2026-01-23 10:39:27.559 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:28.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:28 np0005593294 nova_compute[225705]: 2026-01-23 10:39:28.142 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:28 np0005593294 nova_compute[225705]: 2026-01-23 10:39:28.142 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:28 np0005593294 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:28 np0005593294 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:28 np0005593294 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:28 np0005593294 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:39:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:28.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:29 np0005593294 nova_compute[225705]: 2026-01-23 10:39:29.870 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:30.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:30.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:30 np0005593294 nova_compute[225705]: 2026-01-23 10:39:30.981 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:32.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:32.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:32 np0005593294 nova_compute[225705]: 2026-01-23 10:39:32.561 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:34.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:34.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:35 np0005593294 nova_compute[225705]: 2026-01-23 10:39:35.985 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:36.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:36.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:36 np0005593294 podman[249807]: 2026-01-23 10:39:36.708849686 +0000 UTC m=+0.109788353 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 23 05:39:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:37 np0005593294 nova_compute[225705]: 2026-01-23 10:39:37.563 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.785049) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777785130, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1275, "num_deletes": 251, "total_data_size": 3246737, "memory_usage": 3285344, "flush_reason": "Manual Compaction"}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777801585, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2102623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40738, "largest_seqno": 42008, "table_properties": {"data_size": 2096978, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11955, "raw_average_key_size": 19, "raw_value_size": 2085740, "raw_average_value_size": 3482, "num_data_blocks": 131, "num_entries": 599, "num_filter_entries": 599, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164669, "oldest_key_time": 1769164669, "file_creation_time": 1769164777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 16624 microseconds, and 7735 cpu microseconds.
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.801674) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2102623 bytes OK
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.801711) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.803773) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.803872) EVENT_LOG_v1 {"time_micros": 1769164777803851, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.803919) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3240717, prev total WAL file size 3240717, number of live WAL files 2.
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.805635) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2053KB)], [78(14MB)]
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777805716, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17263759, "oldest_snapshot_seqno": -1}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 7025 keys, 14949781 bytes, temperature: kUnknown
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777910804, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 14949781, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14903190, "index_size": 27919, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17605, "raw_key_size": 185241, "raw_average_key_size": 26, "raw_value_size": 14776802, "raw_average_value_size": 2103, "num_data_blocks": 1086, "num_entries": 7025, "num_filter_entries": 7025, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.911245) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 14949781 bytes
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.914471) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.1 rd, 142.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 14.5 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(15.3) write-amplify(7.1) OK, records in: 7541, records dropped: 516 output_compression: NoCompression
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.914563) EVENT_LOG_v1 {"time_micros": 1769164777914539, "job": 48, "event": "compaction_finished", "compaction_time_micros": 105232, "compaction_time_cpu_micros": 36066, "output_level": 6, "num_output_files": 1, "total_output_size": 14949781, "num_input_records": 7541, "num_output_records": 7025, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777915422, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777918744, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.805466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918871) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918877) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:38.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:40.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:41 np0005593294 nova_compute[225705]: 2026-01-23 10:39:41.029 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:42.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:42.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:42 np0005593294 nova_compute[225705]: 2026-01-23 10:39:42.564 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:44.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:44.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:44 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:46 np0005593294 nova_compute[225705]: 2026-01-23 10:39:46.033 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:46.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:46.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:47 np0005593294 nova_compute[225705]: 2026-01-23 10:39:47.565 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:48.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:48.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:49 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:50.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:50 np0005593294 podman[249841]: 2026-01-23 10:39:50.672871711 +0000 UTC m=+0.069648011 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:39:51 np0005593294 nova_compute[225705]: 2026-01-23 10:39:51.037 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:39:51 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3004.3 total, 600.0 interval#012Cumulative writes: 14K writes, 51K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s#012Cumulative WAL: 14K writes, 4378 syncs, 3.33 writes per sync, written: 0.04 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 764 writes, 1182 keys, 764 commit groups, 1.0 writes per commit group, ingest: 0.38 MB, 0.00 MB/s#012Interval WAL: 764 writes, 370 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:39:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:52.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:52 np0005593294 nova_compute[225705]: 2026-01-23 10:39:52.567 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:54.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:54.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:39:55.074 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:39:55.075 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:39:55.075 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:56 np0005593294 nova_compute[225705]: 2026-01-23 10:39:56.041 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:56.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:56.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:39:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:39:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:39:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:39:57 np0005593294 nova_compute[225705]: 2026-01-23 10:39:57.569 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:58.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:39:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:58.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:00.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:00.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:00 np0005593294 ceph-mon[80126]: overall HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 05:40:01 np0005593294 nova_compute[225705]: 2026-01-23 10:40:01.046 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:02.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:02.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:02 np0005593294 nova_compute[225705]: 2026-01-23 10:40:02.577 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:04.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:04.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:40:06 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:06 np0005593294 nova_compute[225705]: 2026-01-23 10:40:06.049 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:06.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:07 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:07 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:40:07 np0005593294 nova_compute[225705]: 2026-01-23 10:40:07.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:07 np0005593294 podman[249978]: 2026-01-23 10:40:07.722532183 +0000 UTC m=+0.120878362 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:40:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:08.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:08.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:10.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:10.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:11 np0005593294 nova_compute[225705]: 2026-01-23 10:40:11.083 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:11 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:11 np0005593294 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:12.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:12.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:12 np0005593294 nova_compute[225705]: 2026-01-23 10:40:12.582 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:14.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:14 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:14 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:14 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:14.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:14 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:15 np0005593294 nova_compute[225705]: 2026-01-23 10:40:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:15 np0005593294 nova_compute[225705]: 2026-01-23 10:40:15.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:40:15 np0005593294 nova_compute[225705]: 2026-01-23 10:40:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:40:15 np0005593294 nova_compute[225705]: 2026-01-23 10:40:15.890 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:40:16 np0005593294 nova_compute[225705]: 2026-01-23 10:40:16.127 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:16.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:16 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:16 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:16 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:16.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:17 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:17 np0005593294 nova_compute[225705]: 2026-01-23 10:40:17.584 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:18.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:18 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:18 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:18 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:18.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:19 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:19 np0005593294 nova_compute[225705]: 2026-01-23 10:40:19.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:20.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:20 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:20 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:20 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:20.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:21 np0005593294 nova_compute[225705]: 2026-01-23 10:40:21.131 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:21 np0005593294 podman[250061]: 2026-01-23 10:40:21.654822308 +0000 UTC m=+0.055019920 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:40:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:22 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:22.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:22 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:22 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:22 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:22.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:22 np0005593294 nova_compute[225705]: 2026-01-23 10:40:22.586 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:22 np0005593294 nova_compute[225705]: 2026-01-23 10:40:22.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:23 np0005593294 nova_compute[225705]: 2026-01-23 10:40:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:23 np0005593294 nova_compute[225705]: 2026-01-23 10:40:23.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:23 np0005593294 nova_compute[225705]: 2026-01-23 10:40:23.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:23 np0005593294 nova_compute[225705]: 2026-01-23 10:40:23.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:23 np0005593294 nova_compute[225705]: 2026-01-23 10:40:23.902 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:40:23 np0005593294 nova_compute[225705]: 2026-01-23 10:40:23.902 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:24.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:40:24 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1646194779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:24 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:24 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:24 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:24.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.405 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.584 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.585 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4849MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.585 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.586 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.837 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.837 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:40:24 np0005593294 nova_compute[225705]: 2026-01-23 10:40:24.857 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:24 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.260092) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825260160, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 710, "num_deletes": 251, "total_data_size": 1512488, "memory_usage": 1528584, "flush_reason": "Manual Compaction"}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825281077, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 689452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42013, "largest_seqno": 42718, "table_properties": {"data_size": 686359, "index_size": 1001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8261, "raw_average_key_size": 20, "raw_value_size": 679885, "raw_average_value_size": 1712, "num_data_blocks": 43, "num_entries": 397, "num_filter_entries": 397, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164778, "oldest_key_time": 1769164778, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 21088 microseconds, and 3957 cpu microseconds.
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.281168) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 689452 bytes OK
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.281213) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.283325) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.283417) EVENT_LOG_v1 {"time_micros": 1769164825283398, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.283462) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1508683, prev total WAL file size 1508683, number of live WAL files 2.
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.284768) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323535' seq:72057594037927935, type:22 .. '6D6772737461740031353037' seq:0, type:0; will stop at (end)
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(673KB)], [81(14MB)]
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825284859, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 15639233, "oldest_snapshot_seqno": -1}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6923 keys, 11743705 bytes, temperature: kUnknown
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825368046, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11743705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11702287, "index_size": 23002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 183324, "raw_average_key_size": 26, "raw_value_size": 11582026, "raw_average_value_size": 1672, "num_data_blocks": 892, "num_entries": 6923, "num_filter_entries": 6923, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.368380) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11743705 bytes
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.370155) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.8 rd, 141.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 14.3 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(39.7) write-amplify(17.0) OK, records in: 7422, records dropped: 499 output_compression: NoCompression
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.370187) EVENT_LOG_v1 {"time_micros": 1769164825370173, "job": 50, "event": "compaction_finished", "compaction_time_micros": 83286, "compaction_time_cpu_micros": 27282, "output_level": 6, "num_output_files": 1, "total_output_size": 11743705, "num_input_records": 7422, "num_output_records": 6923, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825370583, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825375102, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.284544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375246) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:40:25 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4269113456' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:25 np0005593294 nova_compute[225705]: 2026-01-23 10:40:25.425 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:25 np0005593294 nova_compute[225705]: 2026-01-23 10:40:25.432 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:40:25 np0005593294 nova_compute[225705]: 2026-01-23 10:40:25.446 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:40:25 np0005593294 nova_compute[225705]: 2026-01-23 10:40:25.448 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:40:25 np0005593294 nova_compute[225705]: 2026-01-23 10:40:25.448 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:26 np0005593294 nova_compute[225705]: 2026-01-23 10:40:26.134 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:26.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:26 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:26 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:26 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:26.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:27 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:27 np0005593294 nova_compute[225705]: 2026-01-23 10:40:27.449 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:27 np0005593294 nova_compute[225705]: 2026-01-23 10:40:27.450 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:27 np0005593294 nova_compute[225705]: 2026-01-23 10:40:27.587 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:27 np0005593294 nova_compute[225705]: 2026-01-23 10:40:27.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:28.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:28 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:28 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:28 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:28.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:28 np0005593294 nova_compute[225705]: 2026-01-23 10:40:28.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:29 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:29 np0005593294 nova_compute[225705]: 2026-01-23 10:40:29.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:29 np0005593294 nova_compute[225705]: 2026-01-23 10:40:29.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:40:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:30.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:30 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:30 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:30 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:30.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:31 np0005593294 nova_compute[225705]: 2026-01-23 10:40:31.138 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:32 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:40:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:32.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:40:32 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:32 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:32 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:32 np0005593294 nova_compute[225705]: 2026-01-23 10:40:32.589 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:34.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:34 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:34 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:34 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:34.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:34 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:36 np0005593294 nova_compute[225705]: 2026-01-23 10:40:36.142 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:36 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:36 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:36 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:36.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:37 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:37 np0005593294 nova_compute[225705]: 2026-01-23 10:40:37.591 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:38.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:38 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:38 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:38 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:38.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:38 np0005593294 podman[250157]: 2026-01-23 10:40:38.703008391 +0000 UTC m=+0.101754453 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:40:39 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:40.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:40 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:40 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:40 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:40.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:41 np0005593294 nova_compute[225705]: 2026-01-23 10:40:41.146 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:42 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:42.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:42 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:42 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:42 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:42.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:42 np0005593294 nova_compute[225705]: 2026-01-23 10:40:42.593 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:43 np0005593294 systemd-logind[807]: New session 58 of user zuul.
Jan 23 05:40:43 np0005593294 systemd[1]: Started Session 58 of User zuul.
Jan 23 05:40:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:44.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:44 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:44 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:44 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:44.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:44 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:46 np0005593294 nova_compute[225705]: 2026-01-23 10:40:46.150 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:46.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:46 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:46 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:40:46 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:40:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:47 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:40:47 np0005593294 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8038 writes, 42K keys, 8038 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 8038 writes, 8038 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1508 writes, 7661 keys, 1508 commit groups, 1.0 writes per commit group, ingest: 17.28 MB, 0.03 MB/s#012Interval WAL: 1508 writes, 1508 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     39.5      1.53              0.19        25    0.061       0      0       0.0       0.0#012  L6      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   5.0    102.4     88.0      3.44              0.88        24    0.143    146K    13K       0.0       0.0#012 Sum      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.4      0.1       0.0   6.0     70.8     73.1      4.97              1.07        49    0.101    146K    13K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.7     50.2     48.9      2.07              0.32        14    0.148     51K   4033       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0    102.4     88.0      3.44              0.88        24    0.143    146K    13K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     39.6      1.53              0.19        24    0.064       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.059, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.35 GB write, 0.12 MB/s write, 0.34 GB read, 0.12 MB/s read, 5.0 seconds#012Interval compaction: 0.10 GB write, 0.17 MB/s write, 0.10 GB read, 0.17 MB/s read, 2.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 31.84 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000337 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1898,30.76 MB,10.1171%) FilterBlock(49,427.73 KB,0.137404%) IndexBlock(49,682.17 KB,0.219139%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:40:47 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 05:40:47 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/390486485' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 05:40:47 np0005593294 nova_compute[225705]: 2026-01-23 10:40:47.646 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:48 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:48 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:48 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:48.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:49 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:50.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:50 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:50 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:50 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:51 np0005593294 nova_compute[225705]: 2026-01-23 10:40:51.153 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:51 np0005593294 ovs-vsctl[250513]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 05:40:51 np0005593294 podman[250575]: 2026-01-23 10:40:51.826321022 +0000 UTC m=+0.069907815 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 05:40:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:52 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:52.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:52 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 05:40:52 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 05:40:52 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:52 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:52 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:52.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:52 np0005593294 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 05:40:52 np0005593294 nova_compute[225705]: 2026-01-23 10:40:52.647 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: cache status {prefix=cache status} (starting...)
Jan 23 05:40:52 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:53 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: client ls {prefix=client ls} (starting...)
Jan 23 05:40:53 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:53 np0005593294 lvm[250910]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 05:40:53 np0005593294 lvm[250910]: VG ceph_vg0 finished
Jan 23 05:40:53 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 05:40:53 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 05:40:54 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1569340402' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:54.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:54 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:54 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:54.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 05:40:54 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2991488001' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 05:40:54 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2517041095' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:40:55.076 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:40:55.078 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:55 np0005593294 ovn_metadata_agent[143093]: 2026-01-23 10:40:55.078 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:55 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: ops {prefix=ops} (starting...)
Jan 23 05:40:55 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2494977871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 05:40:55 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/758304118' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 05:40:56 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: session ls {prefix=session ls} (starting...)
Jan 23 05:40:56 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 05:40:56 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:40:56 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3865666164' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:40:56 np0005593294 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: status {prefix=status} (starting...)
Jan 23 05:40:56 np0005593294 nova_compute[225705]: 2026-01-23 10:40:56.157 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:56.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:56 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:56 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:56 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:56.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:56 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:40:56 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1456795587' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:40:56 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 05:40:56 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/883122861' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 05:40:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:40:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:40:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:40:57 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:40:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:40:57 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/365728033' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:40:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 05:40:57 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3781180657' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 05:40:57 np0005593294 nova_compute[225705]: 2026-01-23 10:40:57.649 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:57 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:40:57 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/823562891' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3408741616' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 05:40:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:58.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/458587115' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 05:40:58 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:40:58 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:40:58 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:58.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/445151465' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 05:40:58 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3276688100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 05:40:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:40:59 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1212026481' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:40:59 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:40:59 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1315781543' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:41:00 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a5637525a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a5637523c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.582626343s of 49.590423584s, submitted: 1
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990135 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991647 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 992568 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.072285652s of 12.089330673s, submitted: 4
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a560b463c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1204.3 total, 600.0 interval
Cumulative writes: 9060 writes, 35K keys, 9060 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 9060 writes, 1959 syncs, 4.62 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 781 writes, 1248 keys, 781 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
Interval WAL: 781 writes, 366 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1204.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1204.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1204.3 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022d400 session 0x55a560ef6780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d65000 session 0x55a5601c9c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 95.210830688s of 95.219345093s, submitted: 2
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995001 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a560c23a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a560b46780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d56000 session 0x55a56370c000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a560223860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 99.385841370s of 100.670951843s, submitted: 7
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83787776 unmapped: 57344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83861504 unmapped: 1032192 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83869696 unmapped: 1024000 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995535 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,3])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84934656 unmapped: 1007616 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84942848 unmapped: 999424 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995463 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 3.510344267s of 10.329211235s, submitted: 399
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994281 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a5602190e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a562f18960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.042835236s of 40.052230835s, submitted: 3
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994149 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.938447952s of 13.945899963s, submitted: 2
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 120.149322510s of 120.153068542s, submitted: 1
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 999295 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 1671168 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56022d800 session 0x55a560b46000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fc5e9000/0x0/0x4ffc00000, data 0x161cf8/0x221000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56049b000 session 0x55a563595a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1144302 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb179000/0x0/0x4ffc00000, data 0x15d1cf8/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a563594000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a55fd2bc00 session 0x55a562f1fc20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022d400 session 0x55a56387fa40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.783803940s of 31.204965591s, submitted: 49
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150286 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d65000 session 0x55a560e963c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64c00 session 0x55a5627854a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150194 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb175000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a560a07a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a560b472c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64000 session 0x55a560e912c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.051367760s of 11.062813759s, submitted: 3
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d67400 session 0x55a560e96f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022c400 session 0x55a563594b40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85614592 unmapped: 18161664 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a5635941e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85639168 unmapped: 18137088 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a560b61400 session 0x55a562e5e960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387e5a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d67400 session 0x55a5637521e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561cd6000 session 0x55a560219e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187745 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a56049b000 session 0x55a560e914a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87138304 unmapped: 16637952 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387fc20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1186624 data_alloc: 218103808 data_used: 303104
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87146496 unmapped: 16629760 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89415680 unmapped: 14360576 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89473024 unmapped: 14303232 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.645101547s of 10.825790405s, submitted: 53
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.822414398s of 10.054231644s, submitted: 18
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa929000/0x0/0x4ffc00000, data 0x1e13078/0x1edb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a56327b0e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a563595e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a562f192c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92602368 unmapped: 11173888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.268079758s of 28.147586823s, submitted: 54
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a56021e780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef6d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92594176 unmapped: 11182080 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a562785680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a55f9194a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a562e5fe00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e5f2c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a5602214a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a560222f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93298688 unmapped: 20979712 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.620344162s of 17.155050278s, submitted: 29
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a560b99c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 94502912 unmapped: 19775488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100663296 unmapped: 13615104 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 9502720 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.021077156s of 12.051360130s, submitted: 8
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439531 data_alloc: 234881024 data_used: 14000128
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 9658368 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112254976 unmapped: 8896512 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4e000/0x0/0x4ffc00000, data 0x3655088/0x371e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1517819 data_alloc: 234881024 data_used: 14286848
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112541696 unmapped: 8609792 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.255970955s of 10.148886681s, submitted: 105
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc1800 session 0x55a562e563c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a56370c780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1513215 data_alloc: 234881024 data_used: 14286848
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103571456 unmapped: 17580032 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563595680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.262123108s of 10.403896332s, submitted: 50
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560c245a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5602192c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103260160 unmapped: 17891328 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a56103c5a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560ef61e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562f1f680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.069961548s of 14.267497063s, submitted: 33
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee6f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5601c9c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a5601c9680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.291867256s of 10.369996071s, submitted: 3
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a56327b860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184631 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560a9fa40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee7a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100130816 unmapped: 21020672 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560219a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a5630701e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e56780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a5630d8f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560ef6780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630dad20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aef000/0x0/0x4ffc00000, data 0x1ab6ff3/0x1b7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54000 session 0x55a56103d4a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1226346 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560eef2c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560eefe00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99868672 unmapped: 29679616 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99893248 unmapped: 29655040 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.546638489s of 18.645618439s, submitted: 27
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 26910720 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e561e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b990e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300302 data_alloc: 218103808 data_used: 6565888
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9635000/0x0/0x4ffc00000, data 0x1f6a003/0x2031000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,10])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106921984 unmapped: 22626304 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310702 data_alloc: 218103808 data_used: 6471680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.670749664s of 11.247441292s, submitted: 64
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.062977791s of 12.069572449s, submitted: 1
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302567 data_alloc: 218103808 data_used: 6483968
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560eef0e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5603c8b40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 27033600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a562f1e5a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630703c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a563070f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5630705a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.487091064s of 36.540157318s, submitted: 34
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560ef61e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef63c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560ef7e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55800 session 0x55a560ef6960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560ef6000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275342 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276816 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c57c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101695488 unmapped: 31531008 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102670336 unmapped: 30556160 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1804.3 total, 600.0 interval
Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
Cumulative WAL: 10K writes, 2684 syncs, 4.00 writes per sync, written: 0.03 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1689 writes, 4700 keys, 1689 commit groups, 1.0 writes per commit group, ingest: 4.62 MB, 0.01 MB/s
Interval WAL: 1689 writes, 725 syncs, 2.33 writes per sync, written: 0.00 GB, 0.01 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350291 data_alloc: 234881024 data_used: 11251712
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.624202728s of 23.833007812s, submitted: 21
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111206400 unmapped: 22020096 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412875 data_alloc: 234881024 data_used: 11755520
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8cba000/0x0/0x4ffc00000, data 0x28ecfe3/0x29b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 21528576 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113246208 unmapped: 19980288 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433231 data_alloc: 234881024 data_used: 12496896
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428407 data_alloc: 234881024 data_used: 12496896
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.531507492s of 14.888542175s, submitted: 102
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428511 data_alloc: 234881024 data_used: 12496896
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120709120 unmapped: 12517376 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a560e90d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c22000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd9000 session 0x55a560b47860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1484761 data_alloc: 234881024 data_used: 12496896
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560c24000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630710e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560b463c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a562785a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214800 session 0x55a55fee70e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.321987152s of 11.643644333s, submitted: 14
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a563594960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f84b7000/0x0/0x4ffc00000, data 0x30effe3/0x31b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503794 data_alloc: 234881024 data_used: 14667776
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532370 data_alloc: 234881024 data_used: 18919424
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532658 data_alloc: 234881024 data_used: 18923520
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.284761429s of 12.309606552s, submitted: 7
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121954304 unmapped: 11272192 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649774 data_alloc: 234881024 data_used: 19283968
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1644310 data_alloc: 234881024 data_used: 19283968
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55fee7c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a56103c000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a5603c9a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d000 session 0x55a5601c8000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.608616829s of 10.889985085s, submitted: 103
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601cbe00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87ee000/0x0/0x4ffc00000, data 0x29a7fe3/0x2a6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442224 data_alloc: 234881024 data_used: 12496896
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563070960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee6f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a561048d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217850 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.357207298s of 10.545221329s, submitted: 64
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1219494 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221006 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.268618584s of 12.279949188s, submitted: 3
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.385444641s of 16.389841080s, submitted: 1
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5623ae400 session 0x55a560ef7a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5635954a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630714a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630705a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630703c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286318 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107503616 unmapped: 33071104 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560eeed20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 33587200 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1288671 data_alloc: 218103808 data_used: 393216
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 32366592 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a560eef4a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.396499634s of 20.359004974s, submitted: 48
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111820800 unmapped: 28753920 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115318784 unmapped: 25255936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c5f000/0x0/0x4ffc00000, data 0x2537045/0x25fd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 25124864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c54000/0x0/0x4ffc00000, data 0x2541045/0x2607000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409157 data_alloc: 218103808 data_used: 9031680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bd3000/0x0/0x4ffc00000, data 0x25c3045/0x2689000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113139712 unmapped: 27435008 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 27426816 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408089 data_alloc: 218103808 data_used: 9035776
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870034218s of 12.169149399s, submitted: 80
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411361 data_alloc: 218103808 data_used: 9035776
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411659 data_alloc: 218103808 data_used: 9043968
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 27279360 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412595 data_alloc: 218103808 data_used: 9043968
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.734275818s of 13.774451256s, submitted: 9
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee7c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a55fee6f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a561048d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d62800 session 0x55a562c57c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114442240 unmapped: 26132480 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562c574a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a070e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a563071a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a5603c9680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55c00 session 0x55a560ef7c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 25993216 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec06e/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432887 data_alloc: 218103808 data_used: 9048064
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec0a7/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113704960 unmapped: 26869760 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a5603c9c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 26853376 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434949 data_alloc: 218103808 data_used: 9048064
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d66000 session 0x55a560223a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x26ed0ca/0x27b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.734139442s of 11.903190613s, submitted: 57
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114343936 unmapped: 26230784 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114401280 unmapped: 26173440 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440168 data_alloc: 234881024 data_used: 9789440
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [0,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114794496 unmapped: 25780224 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440509 data_alloc: 234881024 data_used: 9789440
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116137984 unmapped: 24436736 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f840e000/0x0/0x4ffc00000, data 0x2d700ca/0x2e38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1504109 data_alloc: 234881024 data_used: 9850880
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560c25680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef65a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117350400 unmapped: 23224320 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.155679703s of 12.892781258s, submitted: 488
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496421 data_alloc: 234881024 data_used: 9854976
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f83f3000/0x0/0x4ffc00000, data 0x2da10ca/0x2e69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496237 data_alloc: 234881024 data_used: 9854976
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.086823463s of 10.129245758s, submitted: 9
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a5601c8780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55f9181e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116473856 unmapped: 24100864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c22f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.182063103s of 12.550888062s, submitted: 72
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1424015 data_alloc: 218103808 data_used: 9109504
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601ca780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560c24f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a563071680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2bc00 session 0x55a5627841e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a562c56000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.355859756s of 30.541212082s, submitted: 59
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245655 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e57680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56327a5a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a07a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a5603c94a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560a072c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562785e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a560ef6000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56370cb40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a560219860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339283 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112795648 unmapped: 31449088 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560e914a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 31440896 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 31277056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1418779 data_alloc: 234881024 data_used: 12140544
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560b465a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5635954a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.497446060s of 13.612625122s, submitted: 20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56021fe00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.943146706s of 13.990984917s, submitted: 17
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022dc00 session 0x55a56387e780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c570e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5602230e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a563752d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a562f19e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9819000/0x0/0x4ffc00000, data 0x197d045/0x1a43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1289281 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89000 session 0x55a560219a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562e56f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a562f1f680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563752000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c24000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5603c9a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 32890880 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298244 data_alloc: 218103808 data_used: 1359872
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.371298790s of 11.489388466s, submitted: 37
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56021fa40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110919680 unmapped: 33325056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56017a960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 33308672 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: mgrc ms_handle_reset ms_handle_reset con 0x55a561cb3c00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: mgrc handle_mgr_configure stats_period=5
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d68c00 session 0x55a562e5f680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56016f400 session 0x55a562ed23c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a55f6d90e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.596210480s of 16.945894241s, submitted: 37
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260315 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.821660995s of 13.956790924s, submitted: 3
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58400 session 0x55a5601ca3c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b46780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55f919860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562c563c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56370cd20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89800 session 0x55a5630dad20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303917 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96ee000/0x0/0x4ffc00000, data 0x1aa8045/0x1b6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 32940032 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560a9fa40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5601ca3c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.447840691s of 13.576416016s, submitted: 37
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111329280 unmapped: 32915456 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5601caf00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307891 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110780416 unmapped: 33464320 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111091712 unmapped: 33153024 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111099904 unmapped: 33144832 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.020989418s of 15.036432266s, submitted: 3
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112001024 unmapped: 32243712 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115662848 unmapped: 28581888 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.164536476s of 14.331671715s, submitted: 56
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a5601cba40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc2000 session 0x55a5601c81e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560219e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a56021f860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a562ed21e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.203777313s of 22.330352783s, submitted: 42
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213c00 session 0x55a5601ca960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215800 session 0x55a562ed2b40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee70e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560220000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a562e5f2c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9689000/0x0/0x4ffc00000, data 0x1b0dfe3/0x1bd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343500 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a562784780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111263744 unmapped: 32980992 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343804 data_alloc: 218103808 data_used: 339968
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111198208 unmapped: 33046528 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399892 data_alloc: 218103808 data_used: 8769536
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.868194580s of 11.951797485s, submitted: 14
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399760 data_alloc: 218103808 data_used: 8769536
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 28647424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 22994944 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 22970368 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d6bc00 session 0x55a563071680
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a563070960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c881e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560c883c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 119250944 unmapped: 24993792 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a5601ca780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56800 session 0x55a560ef6960
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a561048d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5610485a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a5610492c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586678 data_alloc: 234881024 data_used: 9949184
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 31072256 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a55fee6f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.895648956s of 14.161753654s, submitted: 94
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56257a400 session 0x55a55fee7a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586694 data_alloc: 234881024 data_used: 9949184
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a55fee6d20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee7c20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 27189248 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.947762489s of 13.955580711s, submitted: 2
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134848512 unmapped: 17276928 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f777a000/0x0/0x4ffc00000, data 0x360bff3/0x36d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6b8d000/0x0/0x4ffc00000, data 0x41f0ff3/0x42b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759978 data_alloc: 234881024 data_used: 22151168
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134922240 unmapped: 17203200 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6af2000/0x0/0x4ffc00000, data 0x4293ff3/0x435a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771750 data_alloc: 234881024 data_used: 22212608
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135184384 unmapped: 16941056 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770406 data_alloc: 234881024 data_used: 22220800
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.216358185s of 14.454858780s, submitted: 91
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770158 data_alloc: 234881024 data_used: 22220800
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5603c92c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a560ef7a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1512596 data_alloc: 234881024 data_used: 9957376
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8392000/0x0/0x4ffc00000, data 0x29f3ff3/0x2aba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55f6c5400 session 0x55a561048b40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560ef72c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563610800 session 0x55a560c24f00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.984561920s of 24.057754517s, submitted: 24
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560219860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.494945526s of 40.619098663s, submitted: 20
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120668160 unmapped: 31457280 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 31842304 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560ef7860
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a562c565a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560218000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1358694 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560219e00
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560ef65a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357294 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.344947815s of 10.433979988s, submitted: 23
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a563071a40
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359231 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120242176 unmapped: 31883264 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.895034790s of 14.912478447s, submitted: 5
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123174912 unmapped: 28950528 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1462949 data_alloc: 218103808 data_used: 7610368
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f895c000/0x0/0x4ffc00000, data 0x242a006/0x24f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467605 data_alloc: 218103808 data_used: 7610368
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8863000/0x0/0x4ffc00000, data 0x2523006/0x25e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.162461281s of 11.450368881s, submitted: 61
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126320640 unmapped: 25804800 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1473525 data_alloc: 218103808 data_used: 7856128
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1476015 data_alloc: 218103808 data_used: 7860224
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87c4000/0x0/0x4ffc00000, data 0x25c2006/0x2688000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.021368027s of 10.088058472s, submitted: 14
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202c00 session 0x55a5602214a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c53400 session 0x55a560a9f0e0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308597 data_alloc: 218103808 data_used: 311296
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215c00 session 0x55a5630703c0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111]
                                              ** DB Stats **
                                              Uptime(secs): 2404.3 total, 600.0 interval
                                              Cumulative writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                              Cumulative WAL: 13K writes, 4008 syncs, 3.45 writes per sync, written: 0.04 GB, 0.02 MB/s
                                              Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                              Interval writes: 3081 writes, 10K keys, 3081 commit groups, 1.0 writes per commit group, ingest: 9.92 MB, 0.02 MB/s
                                              Interval WAL: 3081 writes, 1324 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
                                              Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122044416 unmapped: 30081024 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config show' '{prefix=config show}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122052608 unmapped: 30072832 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 30474240 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 30359552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'perf dump' '{prefix=perf dump}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 41041920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'perf schema' '{prefix=perf schema}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 41541632 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 41541632 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 41541632 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:00.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 41492480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 41492480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 41492480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 234.130950928s of 234.467590332s, submitted: 41
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 41451520 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [0,1,0,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 41410560 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120627200 unmapped: 42541056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307821 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120635392 unmapped: 42532864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 42516480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,1,1])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120659968 unmapped: 42508288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120668160 unmapped: 42500096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120676352 unmapped: 42491904 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120692736 unmapped: 42475520 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120733696 unmapped: 42434560 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 5.367417336s of 10.023312569s, submitted: 306
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120766464 unmapped: 42401792 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120791040 unmapped: 42377216 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120799232 unmapped: 42369024 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120799232 unmapped: 42369024 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120815616 unmapped: 42352640 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120815616 unmapped: 42352640 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120815616 unmapped: 42352640 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120889344 unmapped: 42278912 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120889344 unmapped: 42278912 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a562e5f4a0
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121020416 unmapped: 42147840 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121020416 unmapped: 42147840 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121020416 unmapped: 42147840 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 42123264 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 42123264 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 42123264 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5621f9800 session 0x55a56370c780
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d43400 session 0x55a560b98000
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3004.3 total, 600.0 interval#012Cumulative writes: 14K writes, 51K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s#012Cumulative WAL: 14K writes, 4378 syncs, 3.33 writes per sync, written: 0.04 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 764 writes, 1182 keys, 764 commit groups, 1.0 writes per commit group, ingest: 0.38 MB, 0.00 MB/s#012Interval WAL: 764 writes, 370 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 42057728 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 42057728 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 42057728 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 42033152 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 42033152 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 42033152 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 42008576 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 42008576 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config show' '{prefix=config show}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 41746432 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 41877504 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121364480 unmapped: 41803776 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 05:41:00 np0005593294 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}'
Jan 23 05:41:00 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:00 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:00 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:00.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:00 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:41:00 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/716455899' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:41:00 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:41:00 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/148398286' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:41:01 np0005593294 nova_compute[225705]: 2026-01-23 10:41:01.160 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:01 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:41:01 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1085532514' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:41:01 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 05:41:01 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/712991638' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 05:41:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:41:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:41:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:41:02 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:41:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:41:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:02.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:41:02 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:02 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:41:02 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:02.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:41:02 np0005593294 nova_compute[225705]: 2026-01-23 10:41:02.651 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:02 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 05:41:02 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2934036653' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 05:41:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 05:41:03 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2272500806' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 05:41:03 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 05:41:03 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3579800927' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2527938535' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 05:41:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:04.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3136483563' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 05:41:04 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:04 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:04 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:04.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4081738896' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 05:41:04 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2279399780' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 05:41:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 05:41:05 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1474099025' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 05:41:05 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 05:41:05 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/265346978' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 05:41:05 np0005593294 systemd[1]: Starting Hostname Service...
Jan 23 05:41:05 np0005593294 systemd[1]: Started Hostname Service.
Jan 23 05:41:06 np0005593294 nova_compute[225705]: 2026-01-23 10:41:06.164 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 05:41:06 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/142027931' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 05:41:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:41:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:06.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:41:06 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:06 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:06 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:06.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:06 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 05:41:06 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/516926078' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 05:41:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:41:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:41:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:41:07 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:41:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 05:41:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2337187949' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 05:41:07 np0005593294 nova_compute[225705]: 2026-01-23 10:41:07.653 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:07 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 05:41:07 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4288764517' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 05:41:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:41:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:08.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:41:08 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 05:41:08 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1961194654' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 05:41:08 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:08 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:08 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:08.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:41:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:41:09 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 05:41:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4171598296' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 05:41:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:41:09 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:41:09 np0005593294 podman[253097]: 2026-01-23 10:41:09.715732631 +0000 UTC m=+0.119304440 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3513160830' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 05:41:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:10.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:10 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:10 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:10 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:10.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:41:10 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2845937288' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:41:11 np0005593294 nova_compute[225705]: 2026-01-23 10:41:11.167 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:11 np0005593294 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 05:41:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2100362027' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 05:41:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:41:11 np0005593294 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:41:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:41:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:41:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:41:12 np0005593294 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:41:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:41:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:12.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:41:12 np0005593294 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 05:41:12 np0005593294 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:12 np0005593294 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:12.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:12 np0005593294 nova_compute[225705]: 2026-01-23 10:41:12.656 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
